Mar 19 10:21:40 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 19 10:21:40 crc restorecon[4689]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 19 10:21:40 crc restorecon[4689]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc 
restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc 
restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 
10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc 
restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c268,c620 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 10:21:40 crc 
restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 
crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 19 10:21:40 crc restorecon[4689]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:40 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 
10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 10:21:41 crc 
restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc 
restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc 
restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Mar 19 10:21:41 crc restorecon[4689]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 
crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc 
restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc 
restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc 
restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc 
restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc 
restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 10:21:41 crc restorecon[4689]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 19 10:21:41 crc restorecon[4689]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 19 10:21:42 crc kubenswrapper[4765]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 10:21:42 crc kubenswrapper[4765]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 19 10:21:42 crc kubenswrapper[4765]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 10:21:42 crc kubenswrapper[4765]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 19 10:21:42 crc kubenswrapper[4765]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 19 10:21:42 crc kubenswrapper[4765]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.111399 4765 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116747 4765 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116781 4765 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116787 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116795 4765 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116803 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116809 4765 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116816 4765 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116821 4765 feature_gate.go:330] unrecognized feature gate: Example Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116826 4765 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116831 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116836 4765 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116841 4765 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116846 4765 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116851 4765 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116872 4765 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116876 4765 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116881 4765 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116885 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116889 
4765 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116894 4765 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116898 4765 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116902 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116908 4765 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116913 4765 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116918 4765 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116924 4765 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116930 4765 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116935 4765 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116940 4765 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116944 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116949 4765 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116953 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116974 4765 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116980 4765 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116984 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116988 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116993 4765 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.116998 4765 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117003 4765 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117007 4765 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117012 4765 
feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117016 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117021 4765 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117026 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117030 4765 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117035 4765 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117039 4765 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117043 4765 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117047 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117052 4765 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117057 4765 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117062 4765 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117066 4765 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117071 4765 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117076 4765 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 
10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117080 4765 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117084 4765 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117089 4765 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117093 4765 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117097 4765 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117102 4765 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117106 4765 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117110 4765 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117114 4765 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117119 4765 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117124 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117129 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117133 4765 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117137 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117144 4765 
feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.117150 4765 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117249 4765 flags.go:64] FLAG: --address="0.0.0.0" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117262 4765 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117273 4765 flags.go:64] FLAG: --anonymous-auth="true" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117280 4765 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117287 4765 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117292 4765 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117300 4765 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117306 4765 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117311 4765 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117316 4765 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117322 4765 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117327 4765 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117333 4765 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117338 4765 flags.go:64] FLAG: --cgroup-root="" Mar 19 
10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117343 4765 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117351 4765 flags.go:64] FLAG: --client-ca-file="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117356 4765 flags.go:64] FLAG: --cloud-config="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117361 4765 flags.go:64] FLAG: --cloud-provider="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117366 4765 flags.go:64] FLAG: --cluster-dns="[]" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117373 4765 flags.go:64] FLAG: --cluster-domain="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117378 4765 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117384 4765 flags.go:64] FLAG: --config-dir="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117388 4765 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117394 4765 flags.go:64] FLAG: --container-log-max-files="5" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117401 4765 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117407 4765 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117412 4765 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117418 4765 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117423 4765 flags.go:64] FLAG: --contention-profiling="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117430 4765 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117435 4765 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 19 
10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117440 4765 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117444 4765 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117451 4765 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117456 4765 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117461 4765 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117466 4765 flags.go:64] FLAG: --enable-load-reader="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117471 4765 flags.go:64] FLAG: --enable-server="true" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117476 4765 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117484 4765 flags.go:64] FLAG: --event-burst="100" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117489 4765 flags.go:64] FLAG: --event-qps="50" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117494 4765 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117499 4765 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117504 4765 flags.go:64] FLAG: --eviction-hard="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117511 4765 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117516 4765 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117522 4765 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117527 4765 flags.go:64] FLAG: --eviction-soft="" 
Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117532 4765 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117537 4765 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117542 4765 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117548 4765 flags.go:64] FLAG: --experimental-mounter-path="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117553 4765 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117558 4765 flags.go:64] FLAG: --fail-swap-on="true" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117563 4765 flags.go:64] FLAG: --feature-gates="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117570 4765 flags.go:64] FLAG: --file-check-frequency="20s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117576 4765 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117583 4765 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117588 4765 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117593 4765 flags.go:64] FLAG: --healthz-port="10248" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117599 4765 flags.go:64] FLAG: --help="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117605 4765 flags.go:64] FLAG: --hostname-override="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117610 4765 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117615 4765 flags.go:64] FLAG: --http-check-frequency="20s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117621 4765 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 19 
10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117626 4765 flags.go:64] FLAG: --image-credential-provider-config="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117631 4765 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117636 4765 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117641 4765 flags.go:64] FLAG: --image-service-endpoint="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117646 4765 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117651 4765 flags.go:64] FLAG: --kube-api-burst="100" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117656 4765 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117661 4765 flags.go:64] FLAG: --kube-api-qps="50" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117666 4765 flags.go:64] FLAG: --kube-reserved="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117671 4765 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117676 4765 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117681 4765 flags.go:64] FLAG: --kubelet-cgroups="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117687 4765 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117692 4765 flags.go:64] FLAG: --lock-file="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117697 4765 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117703 4765 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117709 4765 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 
19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117718 4765 flags.go:64] FLAG: --log-json-split-stream="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117723 4765 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117728 4765 flags.go:64] FLAG: --log-text-split-stream="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117734 4765 flags.go:64] FLAG: --logging-format="text" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117739 4765 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117751 4765 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117756 4765 flags.go:64] FLAG: --manifest-url="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117762 4765 flags.go:64] FLAG: --manifest-url-header="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117770 4765 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117776 4765 flags.go:64] FLAG: --max-open-files="1000000" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117782 4765 flags.go:64] FLAG: --max-pods="110" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117788 4765 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117793 4765 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117798 4765 flags.go:64] FLAG: --memory-manager-policy="None" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117802 4765 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117808 4765 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117814 4765 flags.go:64] FLAG: 
--node-ip="192.168.126.11" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117819 4765 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117832 4765 flags.go:64] FLAG: --node-status-max-images="50" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117837 4765 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117842 4765 flags.go:64] FLAG: --oom-score-adj="-999" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117847 4765 flags.go:64] FLAG: --pod-cidr="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117852 4765 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117860 4765 flags.go:64] FLAG: --pod-manifest-path="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117865 4765 flags.go:64] FLAG: --pod-max-pids="-1" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117870 4765 flags.go:64] FLAG: --pods-per-core="0" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117876 4765 flags.go:64] FLAG: --port="10250" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117881 4765 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117886 4765 flags.go:64] FLAG: --provider-id="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117891 4765 flags.go:64] FLAG: --qos-reserved="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117896 4765 flags.go:64] FLAG: --read-only-port="10255" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117901 4765 flags.go:64] FLAG: --register-node="true" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117906 4765 flags.go:64] FLAG: 
--register-schedulable="true" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117913 4765 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117923 4765 flags.go:64] FLAG: --registry-burst="10" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117928 4765 flags.go:64] FLAG: --registry-qps="5" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117933 4765 flags.go:64] FLAG: --reserved-cpus="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117938 4765 flags.go:64] FLAG: --reserved-memory="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117945 4765 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117951 4765 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117986 4765 flags.go:64] FLAG: --rotate-certificates="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117992 4765 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.117999 4765 flags.go:64] FLAG: --runonce="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118004 4765 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118010 4765 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118015 4765 flags.go:64] FLAG: --seccomp-default="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118020 4765 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118026 4765 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118031 4765 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118037 4765 
flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118042 4765 flags.go:64] FLAG: --storage-driver-password="root" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118047 4765 flags.go:64] FLAG: --storage-driver-secure="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118053 4765 flags.go:64] FLAG: --storage-driver-table="stats" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118058 4765 flags.go:64] FLAG: --storage-driver-user="root" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118063 4765 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118069 4765 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118074 4765 flags.go:64] FLAG: --system-cgroups="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118079 4765 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118088 4765 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118093 4765 flags.go:64] FLAG: --tls-cert-file="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118098 4765 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118104 4765 flags.go:64] FLAG: --tls-min-version="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118109 4765 flags.go:64] FLAG: --tls-private-key-file="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118115 4765 flags.go:64] FLAG: --topology-manager-policy="none" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118120 4765 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118125 4765 flags.go:64] FLAG: --topology-manager-scope="container" Mar 19 10:21:42 crc kubenswrapper[4765]: 
I0319 10:21:42.118130 4765 flags.go:64] FLAG: --v="2" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118138 4765 flags.go:64] FLAG: --version="false" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118185 4765 flags.go:64] FLAG: --vmodule="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118192 4765 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118198 4765 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118340 4765 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118349 4765 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118355 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118365 4765 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118370 4765 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118375 4765 feature_gate.go:330] unrecognized feature gate: Example Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118381 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118386 4765 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118391 4765 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118396 4765 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 
10:21:42.118400 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118405 4765 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118409 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118413 4765 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118419 4765 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118424 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118429 4765 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118434 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118439 4765 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118444 4765 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118448 4765 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118453 4765 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118458 4765 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118463 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118467 4765 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 10:21:42 
crc kubenswrapper[4765]: W0319 10:21:42.118472 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118476 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118481 4765 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118485 4765 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118490 4765 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118494 4765 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118501 4765 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118508 4765 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118514 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118520 4765 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118526 4765 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118531 4765 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118538 4765 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118544 4765 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118549 4765 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118554 4765 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118559 4765 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118566 4765 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118571 4765 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118576 4765 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118580 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118585 4765 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118590 4765 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118594 4765 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118599 4765 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118603 4765 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118608 4765 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 10:21:42 crc 
kubenswrapper[4765]: W0319 10:21:42.118612 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118616 4765 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118621 4765 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118626 4765 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118630 4765 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118636 4765 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118641 4765 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118646 4765 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118651 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118655 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118660 4765 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118664 4765 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118668 4765 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118673 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118677 4765 feature_gate.go:330] 
unrecognized feature gate: ImageStreamImportMode Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118684 4765 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118689 4765 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118693 4765 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.118698 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.118713 4765 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.129265 4765 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.129314 4765 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129392 4765 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129402 4765 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129407 4765 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129412 4765 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 
10:21:42.129416 4765 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129420 4765 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129425 4765 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129430 4765 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129434 4765 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129439 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129444 4765 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129449 4765 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129454 4765 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129459 4765 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129464 4765 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129471 4765 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129476 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129481 4765 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129486 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129490 4765 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129494 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129498 4765 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129501 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129505 4765 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129509 4765 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129513 4765 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129519 4765 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129523 4765 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129528 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129532 4765 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129536 4765 feature_gate.go:330] 
unrecognized feature gate: MultiArchInstallAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129540 4765 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129544 4765 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129548 4765 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129552 4765 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129556 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129560 4765 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129564 4765 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129568 4765 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129571 4765 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129576 4765 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129580 4765 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129585 4765 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129590 4765 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129594 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud 
Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129598 4765 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129604 4765 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129614 4765 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129620 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129626 4765 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129631 4765 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129635 4765 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129640 4765 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129644 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129649 4765 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129653 4765 feature_gate.go:330] unrecognized feature gate: Example Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129657 4765 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129661 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129666 4765 
feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129673 4765 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129678 4765 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129682 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129687 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129691 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129695 4765 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129699 4765 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129704 4765 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129708 4765 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129712 4765 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129717 4765 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129721 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.129730 4765 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false 
KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129892 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129905 4765 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129909 4765 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129914 4765 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129919 4765 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129924 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129930 4765 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129935 4765 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129941 4765 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129948 4765 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129953 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129981 4765 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 19 10:21:42 crc 
kubenswrapper[4765]: W0319 10:21:42.129988 4765 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.129994 4765 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130000 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130005 4765 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130009 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130014 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130018 4765 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130022 4765 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130028 4765 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130034 4765 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130038 4765 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130043 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130048 4765 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130054 4765 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130060 4765 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130064 4765 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130068 4765 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130072 4765 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130077 4765 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130081 4765 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130085 4765 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130089 4765 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130095 4765 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130100 4765 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130106 4765 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130111 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130117 4765 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130122 4765 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130126 4765 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130131 4765 feature_gate.go:330] unrecognized feature gate: Example Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130135 4765 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130140 4765 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130144 4765 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130189 4765 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130194 4765 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130198 4765 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130202 4765 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130207 4765 feature_gate.go:330] unrecognized 
feature gate: VSphereDriverConfiguration Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130211 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130216 4765 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130220 4765 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130225 4765 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130229 4765 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130234 4765 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130238 4765 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130243 4765 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130247 4765 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130251 4765 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130256 4765 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130260 4765 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130266 4765 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130270 4765 feature_gate.go:330] unrecognized feature gate: 
SetEIPForNLBIngressController Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130275 4765 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130279 4765 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130283 4765 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130288 4765 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130293 4765 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130297 4765 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.130302 4765 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.130309 4765 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.130500 4765 server.go:940] "Client rotation is on, will bootstrap in background" Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.139525 4765 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.144157 4765 
bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.144312 4765 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.146150 4765 server.go:997] "Starting client certificate rotation" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.146230 4765 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.146561 4765 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.175494 4765 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.178055 4765 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.179788 4765 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.200149 4765 log.go:25] "Validated CRI v1 runtime API" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.233949 4765 log.go:25] "Validated CRI v1 image API" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.237683 4765 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 19 10:21:42 crc 
kubenswrapper[4765]: I0319 10:21:42.241901 4765 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-19-10-17-23-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.241953 4765 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:49 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.270433 4765 manager.go:217] Machine: {Timestamp:2026-03-19 10:21:42.266076545 +0000 UTC m=+0.615022167 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2800000 MemoryCapacity:33654128640 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:7efe4e5f-b64c-4a0b-8f3f-69f763aea23b BootID:f831093e-daf4-4112-8683-64c2fcb4a46e Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:49 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 
DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:9c:a4:b0 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:9c:a4:b0 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:33:3c:a1 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:43:6d:6f Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:43:cd:c7 Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:57:6b:d2 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:0a:27:ac:4b:eb:51 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:c2:84:94:5a:47:70 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 
Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.271003 4765 
manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.271431 4765 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.272302 4765 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.272614 4765 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.272662 4765 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.in
odesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.273011 4765 topology_manager.go:138] "Creating topology manager with none policy" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.273030 4765 container_manager_linux.go:303] "Creating device plugin manager" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.273784 4765 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.273848 4765 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.274102 4765 state_mem.go:36] "Initialized new in-memory state store" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.274328 4765 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.280524 4765 kubelet.go:418] "Attempting to sync node with API server" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.280552 4765 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.280587 4765 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.280599 4765 kubelet.go:324] "Adding apiserver pod source" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.280610 4765 apiserver.go:42] 
"Waiting for node sync before watching apiserver pods" Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.285888 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.286001 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.286075 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.286028 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.286623 4765 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.287671 4765 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.290104 4765 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.291775 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.291806 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.291816 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.291828 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.291846 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.291859 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.291871 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.291890 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.291901 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.291911 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.291924 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.291934 4765 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.292788 4765 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.293385 4765 server.go:1280] "Started kubelet" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.293758 4765 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.293804 4765 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.294561 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:21:42 crc systemd[1]: Started Kubernetes Kubelet. Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.295639 4765 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.295933 4765 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.295990 4765 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.296507 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.296543 4765 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.297250 4765 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.296569 4765 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.297395 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list 
*v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.297495 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.298414 4765 factory.go:55] Registering systemd factory Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.298455 4765 factory.go:221] Registration of the systemd container factory successfully Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.298397 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" interval="200ms" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.298837 4765 server.go:460] "Adding debug handlers to kubelet server" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.299086 4765 factory.go:153] Registering CRI-O factory Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.299120 4765 factory.go:221] Registration of the crio container factory successfully Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.306895 4765 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.306949 4765 factory.go:103] Registering Raw factory Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 
10:21:42.306986 4765 manager.go:1196] Started watching for new ooms in manager Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.307568 4765 manager.go:319] Starting recovery of all containers Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.310806 4765 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.129.56.13:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e36eec0aebd50 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.293347664 +0000 UTC m=+0.642293236,LastTimestamp:2026-03-19 10:21:42.293347664 +0000 UTC m=+0.642293236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316218 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316336 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316351 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" 
seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316363 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316376 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316389 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316398 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316411 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316425 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: 
I0319 10:21:42.316436 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316446 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316458 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316468 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316482 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316493 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316506 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the 
actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316517 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316529 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316541 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316553 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316565 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316579 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316594 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316607 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316621 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316634 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316651 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316666 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" 
volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316682 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316696 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316711 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316728 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316740 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316752 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" 
volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316813 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316825 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316838 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316849 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316862 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316876 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316888 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316899 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316911 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316922 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316934 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316945 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316975 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.316990 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317003 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317015 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317028 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317040 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" 
volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317057 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317070 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317082 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317095 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317107 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317119 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" 
volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317132 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317143 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317154 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317166 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317178 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317189 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" 
volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317201 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317211 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317223 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317236 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317250 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317263 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317277 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317293 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317305 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317316 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317327 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317338 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317349 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317361 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317373 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317384 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317395 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317407 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" 
Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317417 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317428 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317439 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317452 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317464 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317475 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317486 4765 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317498 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317510 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317522 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317534 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317546 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317559 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" 
volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317571 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317583 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317593 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317605 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317616 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317628 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317641 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317655 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317669 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317686 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317698 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317711 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" 
seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317723 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317735 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317748 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317762 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317775 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317787 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317800 
4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317811 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317823 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317834 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317845 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317858 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317868 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" 
pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317880 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317891 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317905 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317918 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317929 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317941 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317972 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317985 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.317996 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318007 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318020 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318031 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318042 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318053 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318065 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318081 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318092 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318104 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" 
volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318117 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318129 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318141 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318152 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318168 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318179 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" 
volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318190 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318201 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318213 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318251 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318263 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318274 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" 
volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318286 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318300 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318311 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318324 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318338 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318349 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318362 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318373 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318388 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318399 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318410 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318420 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318433 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318444 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318456 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318467 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318478 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318495 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" 
volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318508 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.318521 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320581 4765 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320605 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320619 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320630 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320643 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320654 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320664 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320674 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320684 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320695 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" 
volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320706 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320718 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320729 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320739 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320753 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320762 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320772 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320782 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320793 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320807 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320819 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320828 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" 
volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320838 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320848 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320858 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320869 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320881 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320892 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320904 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320917 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320929 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320941 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320953 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.320988 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" 
volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.321000 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.321015 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.321027 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.321039 4765 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.321049 4765 reconstruct.go:97] "Volume reconstruction finished" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.321057 4765 reconciler.go:26] "Reconciler: start to sync state" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.328569 4765 manager.go:324] Recovery completed Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.339496 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.341385 4765 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.341462 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.341483 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.342902 4765 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.342928 4765 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.342970 4765 state_mem.go:36] "Initialized new in-memory state store" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.351518 4765 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.353949 4765 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.354782 4765 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.354819 4765 kubelet.go:2335] "Starting kubelet main sync loop" Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.354869 4765 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.361669 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.361873 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.368862 4765 policy_none.go:49] "None policy: Start" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.369673 4765 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.369698 4765 state_mem.go:35] "Initializing new in-memory state store" Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.396894 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.412863 4765 manager.go:334] "Starting Device Plugin manager" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.412915 4765 manager.go:513] "Failed to read data from 
checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.412929 4765 server.go:79] "Starting device plugin registration server" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.413431 4765 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.413452 4765 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.413574 4765 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.413668 4765 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.413683 4765 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.420018 4765 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.455041 4765 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.455153 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.456472 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.456508 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.456520 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.456646 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.456788 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.456819 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.457367 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.457393 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.457405 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.457520 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.457530 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.457534 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.457615 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.457672 4765 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.457703 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.459026 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.459079 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.459111 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.459264 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.459307 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.459322 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.459389 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.460141 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.460191 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.460618 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.460650 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.460663 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.460864 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.461074 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.461119 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.461329 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.461378 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.461390 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.462417 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.462443 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.462454 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.462790 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.462817 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.462828 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.463006 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.463035 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.464292 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.464314 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.464322 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.499224 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" interval="400ms" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.513885 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.515211 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.515273 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.515296 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.515333 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.515867 4765 
kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.13:6443: connect: connection refused" node="crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.523364 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.523400 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.523425 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.523448 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.523469 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.523490 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.523532 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.523625 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.523683 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.523736 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: 
\"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.523779 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.523818 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.523857 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.523898 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.523935 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: 
\"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.624699 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.624818 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.624845 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.624870 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.624916 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.624946 4765 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.624932 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625010 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625059 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625025 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625081 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod 
\"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625087 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625104 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625129 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625129 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625155 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625167 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625187 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625173 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625243 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625275 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625301 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: 
\"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625306 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625347 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625350 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625328 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625355 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625394 
4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625209 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.625526 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.716496 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.717756 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.717794 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.717805 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.717833 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.718321 4765 kubelet_node_status.go:99] "Unable to register node with API server" 
err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.13:6443: connect: connection refused" node="crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.784540 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.808806 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.821022 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.828000 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.828354 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-a67324a940625050ab23cd1e454d128a1282c2337ec0c560c8ef4f01ae4837bc WatchSource:0}: Error finding container a67324a940625050ab23cd1e454d128a1282c2337ec0c560c8ef4f01ae4837bc: Status 404 returned error can't find the container with id a67324a940625050ab23cd1e454d128a1282c2337ec0c560c8ef4f01ae4837bc Mar 19 10:21:42 crc kubenswrapper[4765]: I0319 10:21:42.832020 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.842623 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-6e5c7d81b1aa0d11a5e1886e69422d927ba4d10189ae664dd0479e4b38561fec WatchSource:0}: Error finding container 6e5c7d81b1aa0d11a5e1886e69422d927ba4d10189ae664dd0479e4b38561fec: Status 404 returned error can't find the container with id 6e5c7d81b1aa0d11a5e1886e69422d927ba4d10189ae664dd0479e4b38561fec Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.857430 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-04c56344c74f6f9a3da73ce9b10a2d39a2ab19d9fbca2911c293a1a62c2ea346 WatchSource:0}: Error finding container 04c56344c74f6f9a3da73ce9b10a2d39a2ab19d9fbca2911c293a1a62c2ea346: Status 404 returned error can't find the container with id 04c56344c74f6f9a3da73ce9b10a2d39a2ab19d9fbca2911c293a1a62c2ea346 Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.857705 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-7201ea5e75127239df52afae6104feb9b9d39e56ef2a717a26b7fbf00a501c21 WatchSource:0}: Error finding container 7201ea5e75127239df52afae6104feb9b9d39e56ef2a717a26b7fbf00a501c21: Status 404 returned error can't find the container with id 7201ea5e75127239df52afae6104feb9b9d39e56ef2a717a26b7fbf00a501c21 Mar 19 10:21:42 crc kubenswrapper[4765]: W0319 10:21:42.862368 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-d34e59c09f8a61cbb5065af60539585075f627a64ebb2d4b0296011bb1ada226 
WatchSource:0}: Error finding container d34e59c09f8a61cbb5065af60539585075f627a64ebb2d4b0296011bb1ada226: Status 404 returned error can't find the container with id d34e59c09f8a61cbb5065af60539585075f627a64ebb2d4b0296011bb1ada226 Mar 19 10:21:42 crc kubenswrapper[4765]: E0319 10:21:42.900841 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" interval="800ms" Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.119285 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.120549 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.120600 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.120613 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.120642 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 10:21:43 crc kubenswrapper[4765]: E0319 10:21:43.121145 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.13:6443: connect: connection refused" node="crc" Mar 19 10:21:43 crc kubenswrapper[4765]: W0319 10:21:43.273827 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:21:43 crc 
kubenswrapper[4765]: E0319 10:21:43.273914 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:21:43 crc kubenswrapper[4765]: W0319 10:21:43.287856 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:21:43 crc kubenswrapper[4765]: E0319 10:21:43.287934 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.295383 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.360357 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"a67324a940625050ab23cd1e454d128a1282c2337ec0c560c8ef4f01ae4837bc"} Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.361050 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"d34e59c09f8a61cbb5065af60539585075f627a64ebb2d4b0296011bb1ada226"} Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.361821 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"7201ea5e75127239df52afae6104feb9b9d39e56ef2a717a26b7fbf00a501c21"} Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.362429 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"04c56344c74f6f9a3da73ce9b10a2d39a2ab19d9fbca2911c293a1a62c2ea346"} Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.363052 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6e5c7d81b1aa0d11a5e1886e69422d927ba4d10189ae664dd0479e4b38561fec"} Mar 19 10:21:43 crc kubenswrapper[4765]: W0319 10:21:43.380759 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:21:43 crc kubenswrapper[4765]: E0319 10:21:43.380840 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:21:43 crc kubenswrapper[4765]: E0319 10:21:43.702168 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" interval="1.6s" Mar 19 10:21:43 crc kubenswrapper[4765]: W0319 10:21:43.887451 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:21:43 crc kubenswrapper[4765]: E0319 10:21:43.887524 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.922167 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.924089 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.924135 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.924145 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:43 crc kubenswrapper[4765]: I0319 10:21:43.924177 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 10:21:43 crc kubenswrapper[4765]: E0319 10:21:43.924705 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.13:6443: connect: connection refused" node="crc" 
Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.296242 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.369004 4765 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c" exitCode=0 Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.369115 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c"} Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.369152 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.370288 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.370333 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.370352 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.370734 4765 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593" exitCode=0 Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.370784 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593"} Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.370847 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.371703 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.371735 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.371745 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.375086 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.375079 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"86f2e0fb32cf6d35487d16c0464a828889fefc55528da6b8de5f2d70fd3273e0"} Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.375144 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"3a87713adc9f4b89e52a3a22705557d52f4258ca4d042b27d0fc0826ae5dbd02"} Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.375170 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"0f8f655b4daf3b924b5dd3a003a3c3a3bbd03fd6d71a22dca1a0304d621da9c9"} Mar 19 
10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.375190 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7e788c4b7d579a7df5887b915a34e162d4961a7767a1c54e6d9fcc3c2917d550"} Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.376261 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.376300 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.376316 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.376953 4765 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.377658 4765 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f" exitCode=0 Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.377719 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f"} Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.377809 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:44 crc kubenswrapper[4765]: E0319 10:21:44.377823 4765 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create 
certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.379036 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.379064 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.379076 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.380300 4765 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c" exitCode=0 Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.380340 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c"} Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.380462 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.381343 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.381377 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.381391 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:44 crc 
kubenswrapper[4765]: I0319 10:21:44.388833 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.390897 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.390938 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:44 crc kubenswrapper[4765]: I0319 10:21:44.390948 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:45 crc kubenswrapper[4765]: W0319 10:21:45.013256 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:21:45 crc kubenswrapper[4765]: E0319 10:21:45.013358 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:21:45 crc kubenswrapper[4765]: W0319 10:21:45.253043 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:21:45 crc kubenswrapper[4765]: E0319 10:21:45.253137 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.296062 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:21:45 crc kubenswrapper[4765]: E0319 10:21:45.303750 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" interval="3.2s" Mar 19 10:21:45 crc kubenswrapper[4765]: W0319 10:21:45.381097 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:21:45 crc kubenswrapper[4765]: E0319 10:21:45.381162 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.384706 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.384704 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916"} Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.384925 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad"} Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.384949 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b"} Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.385367 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.385401 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.385411 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.387408 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929"} Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.387452 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe"} Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 
10:21:45.387465 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236"} Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.387474 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079"} Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.389171 4765 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552" exitCode=0 Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.389207 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552"} Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.389319 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.389917 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.389946 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.389969 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.390584 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" 
event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"90c9e97886fc75e766ec52c9eacb1c11e05d38f7bc2a5b5b1529561c48d7199d"} Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.390620 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.390727 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.395116 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.395146 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.395157 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.395166 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.395193 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.395204 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.525460 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.526662 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.526691 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.526700 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.526728 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 10:21:45 crc kubenswrapper[4765]: E0319 10:21:45.527104 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.129.56.13:6443: connect: connection refused" node="crc" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.981245 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:21:45 crc kubenswrapper[4765]: I0319 10:21:45.991300 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.395009 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"290b15e4e278b1564b630a6c59533011e6a6a0e3346141a3c2b36038cfabc144"} Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.395026 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.395914 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.395941 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.395949 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.397734 4765 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d" exitCode=0 Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.397802 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.397834 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.397910 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.398336 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d"} Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.398389 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.398408 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.398578 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.398605 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.398614 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.398790 4765 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.398805 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.398814 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.399053 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.399098 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.399117 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.399164 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.399179 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:46 crc kubenswrapper[4765]: I0319 10:21:46.399188 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.408715 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403"} Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.408853 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.409546 4765 kubelet_node_status.go:401] "Setting node 
annotation to enable volume controller attach/detach" Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.408879 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.409865 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.409949 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529"} Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.410038 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c"} Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.410083 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e"} Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.410106 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6"} Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.410008 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.411210 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.411274 4765 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.411297 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.412005 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.412054 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.412072 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.412201 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.412235 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.412252 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:47 crc kubenswrapper[4765]: I0319 10:21:47.711483 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 10:21:48.411458 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 10:21:48.411521 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 10:21:48.411520 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 10:21:48.412753 4765 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 10:21:48.412786 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 10:21:48.412796 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 10:21:48.412753 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 10:21:48.412887 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 10:21:48.412897 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 10:21:48.546726 4765 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 10:21:48.727651 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 10:21:48.729081 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 10:21:48.729154 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 10:21:48.729174 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 10:21:48.729214 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 10:21:48 crc kubenswrapper[4765]: I0319 
10:21:48.741357 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:21:49 crc kubenswrapper[4765]: I0319 10:21:49.414431 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 10:21:49 crc kubenswrapper[4765]: I0319 10:21:49.414485 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:49 crc kubenswrapper[4765]: I0319 10:21:49.415338 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:49 crc kubenswrapper[4765]: I0319 10:21:49.415382 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:49 crc kubenswrapper[4765]: I0319 10:21:49.415401 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:49 crc kubenswrapper[4765]: I0319 10:21:49.666045 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 19 10:21:49 crc kubenswrapper[4765]: I0319 10:21:49.666273 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:49 crc kubenswrapper[4765]: I0319 10:21:49.667439 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:49 crc kubenswrapper[4765]: I0319 10:21:49.667471 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:49 crc kubenswrapper[4765]: I0319 10:21:49.667480 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:49 crc kubenswrapper[4765]: I0319 10:21:49.837652 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 
10:21:50 crc kubenswrapper[4765]: I0319 10:21:50.417047 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:50 crc kubenswrapper[4765]: I0319 10:21:50.418122 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:50 crc kubenswrapper[4765]: I0319 10:21:50.418213 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:50 crc kubenswrapper[4765]: I0319 10:21:50.418234 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:50 crc kubenswrapper[4765]: I0319 10:21:50.786448 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:21:50 crc kubenswrapper[4765]: I0319 10:21:50.786644 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 10:21:50 crc kubenswrapper[4765]: I0319 10:21:50.786696 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:50 crc kubenswrapper[4765]: I0319 10:21:50.788376 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:50 crc kubenswrapper[4765]: I0319 10:21:50.788423 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:50 crc kubenswrapper[4765]: I0319 10:21:50.788436 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:51 crc kubenswrapper[4765]: I0319 10:21:51.176710 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:21:51 crc kubenswrapper[4765]: I0319 10:21:51.419308 4765 prober_manager.go:312] 
"Failed to trigger a manual run" probe="Readiness" Mar 19 10:21:51 crc kubenswrapper[4765]: I0319 10:21:51.419520 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:51 crc kubenswrapper[4765]: I0319 10:21:51.422570 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:51 crc kubenswrapper[4765]: I0319 10:21:51.422673 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:51 crc kubenswrapper[4765]: I0319 10:21:51.422687 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:52 crc kubenswrapper[4765]: I0319 10:21:52.242571 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 10:21:52 crc kubenswrapper[4765]: I0319 10:21:52.242812 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:52 crc kubenswrapper[4765]: I0319 10:21:52.244036 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:52 crc kubenswrapper[4765]: I0319 10:21:52.244121 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:52 crc kubenswrapper[4765]: I0319 10:21:52.244140 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:52 crc kubenswrapper[4765]: E0319 10:21:52.420141 4765 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 10:21:52 crc kubenswrapper[4765]: I0319 10:21:52.591543 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
Mar 19 10:21:52 crc kubenswrapper[4765]: I0319 10:21:52.591715 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:52 crc kubenswrapper[4765]: I0319 10:21:52.592941 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:52 crc kubenswrapper[4765]: I0319 10:21:52.593008 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:52 crc kubenswrapper[4765]: I0319 10:21:52.593020 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:52 crc kubenswrapper[4765]: I0319 10:21:52.960933 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 19 10:21:52 crc kubenswrapper[4765]: I0319 10:21:52.961155 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:52 crc kubenswrapper[4765]: I0319 10:21:52.962402 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:52 crc kubenswrapper[4765]: I0319 10:21:52.962438 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:52 crc kubenswrapper[4765]: I0319 10:21:52.962452 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:53 crc kubenswrapper[4765]: I0319 10:21:53.787283 4765 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:21:53 crc kubenswrapper[4765]: I0319 10:21:53.787390 4765 
prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:21:56 crc kubenswrapper[4765]: I0319 10:21:56.296150 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 19 10:21:56 crc kubenswrapper[4765]: W0319 10:21:56.609271 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:21:56Z is after 2026-02-23T05:33:13Z Mar 19 10:21:56 crc kubenswrapper[4765]: E0319 10:21:56.609372 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:21:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:21:56 crc kubenswrapper[4765]: W0319 10:21:56.618320 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:21:56Z is after 2026-02-23T05:33:13Z Mar 19 10:21:56 crc 
kubenswrapper[4765]: E0319 10:21:56.618416 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:21:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:21:56 crc kubenswrapper[4765]: E0319 10:21:56.622046 4765 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:21:56Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e36eec0aebd50 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.293347664 +0000 UTC m=+0.642293236,LastTimestamp:2026-03-19 10:21:42.293347664 +0000 UTC m=+0.642293236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:21:56 crc kubenswrapper[4765]: W0319 10:21:56.622388 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:21:56Z is after 2026-02-23T05:33:13Z Mar 19 10:21:56 crc kubenswrapper[4765]: E0319 10:21:56.622438 4765 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:21:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:21:56 crc kubenswrapper[4765]: W0319 10:21:56.628371 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:21:56Z is after 2026-02-23T05:33:13Z Mar 19 10:21:56 crc kubenswrapper[4765]: E0319 10:21:56.628513 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:21:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:21:56 crc kubenswrapper[4765]: E0319 10:21:56.628792 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:21:56Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 10:21:56 crc kubenswrapper[4765]: E0319 10:21:56.629207 4765 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post 
\"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:21:56Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:21:56 crc kubenswrapper[4765]: I0319 10:21:56.629247 4765 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 10:21:56 crc kubenswrapper[4765]: I0319 10:21:56.629309 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 19 10:21:56 crc kubenswrapper[4765]: I0319 10:21:56.635518 4765 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 10:21:56 crc kubenswrapper[4765]: I0319 10:21:56.635611 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 19 10:21:56 crc kubenswrapper[4765]: E0319 10:21:56.636370 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:21:56Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 19 10:21:57 crc kubenswrapper[4765]: I0319 10:21:57.300004 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:21:57Z is after 2026-02-23T05:33:13Z Mar 19 10:21:57 crc kubenswrapper[4765]: I0319 10:21:57.387401 4765 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Liveness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 19 10:21:57 crc kubenswrapper[4765]: I0319 10:21:57.387459 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 19 10:21:57 crc kubenswrapper[4765]: I0319 10:21:57.436248 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 10:21:57 crc kubenswrapper[4765]: I0319 10:21:57.438797 4765 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="290b15e4e278b1564b630a6c59533011e6a6a0e3346141a3c2b36038cfabc144" exitCode=255 Mar 19 10:21:57 crc kubenswrapper[4765]: I0319 10:21:57.438868 4765 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"290b15e4e278b1564b630a6c59533011e6a6a0e3346141a3c2b36038cfabc144"} Mar 19 10:21:57 crc kubenswrapper[4765]: I0319 10:21:57.439156 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:57 crc kubenswrapper[4765]: I0319 10:21:57.440791 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:57 crc kubenswrapper[4765]: I0319 10:21:57.440841 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:57 crc kubenswrapper[4765]: I0319 10:21:57.440860 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:57 crc kubenswrapper[4765]: I0319 10:21:57.441596 4765 scope.go:117] "RemoveContainer" containerID="290b15e4e278b1564b630a6c59533011e6a6a0e3346141a3c2b36038cfabc144" Mar 19 10:21:58 crc kubenswrapper[4765]: I0319 10:21:58.300890 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:21:58Z is after 2026-02-23T05:33:13Z Mar 19 10:21:58 crc kubenswrapper[4765]: I0319 10:21:58.443400 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 10:21:58 crc kubenswrapper[4765]: I0319 10:21:58.445359 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"61d28da17c510ae5b390edbe9ce51733566549bcd52f654f8852fa5bf3cb5fa5"} Mar 19 10:21:58 crc kubenswrapper[4765]: I0319 10:21:58.445595 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:58 crc kubenswrapper[4765]: I0319 10:21:58.446541 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:58 crc kubenswrapper[4765]: I0319 10:21:58.446591 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:58 crc kubenswrapper[4765]: I0319 10:21:58.446609 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:58 crc kubenswrapper[4765]: I0319 10:21:58.747511 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:21:59 crc kubenswrapper[4765]: I0319 10:21:59.300802 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:21:59Z is after 2026-02-23T05:33:13Z Mar 19 10:21:59 crc kubenswrapper[4765]: I0319 10:21:59.451621 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 10:21:59 crc kubenswrapper[4765]: I0319 10:21:59.453210 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 19 10:21:59 crc kubenswrapper[4765]: I0319 10:21:59.456315 4765 generic.go:334] "Generic (PLEG): 
container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="61d28da17c510ae5b390edbe9ce51733566549bcd52f654f8852fa5bf3cb5fa5" exitCode=255 Mar 19 10:21:59 crc kubenswrapper[4765]: I0319 10:21:59.456357 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"61d28da17c510ae5b390edbe9ce51733566549bcd52f654f8852fa5bf3cb5fa5"} Mar 19 10:21:59 crc kubenswrapper[4765]: I0319 10:21:59.456451 4765 scope.go:117] "RemoveContainer" containerID="290b15e4e278b1564b630a6c59533011e6a6a0e3346141a3c2b36038cfabc144" Mar 19 10:21:59 crc kubenswrapper[4765]: I0319 10:21:59.456487 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:21:59 crc kubenswrapper[4765]: I0319 10:21:59.458129 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:21:59 crc kubenswrapper[4765]: I0319 10:21:59.458178 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:21:59 crc kubenswrapper[4765]: I0319 10:21:59.458193 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:21:59 crc kubenswrapper[4765]: I0319 10:21:59.458886 4765 scope.go:117] "RemoveContainer" containerID="61d28da17c510ae5b390edbe9ce51733566549bcd52f654f8852fa5bf3cb5fa5" Mar 19 10:21:59 crc kubenswrapper[4765]: E0319 10:21:59.459252 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 10:21:59 crc 
kubenswrapper[4765]: I0319 10:21:59.463815 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:21:59 crc kubenswrapper[4765]: I0319 10:21:59.838066 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:22:00 crc kubenswrapper[4765]: I0319 10:22:00.300117 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:00Z is after 2026-02-23T05:33:13Z Mar 19 10:22:00 crc kubenswrapper[4765]: I0319 10:22:00.461529 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 10:22:00 crc kubenswrapper[4765]: I0319 10:22:00.464185 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:00 crc kubenswrapper[4765]: I0319 10:22:00.465366 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:00 crc kubenswrapper[4765]: I0319 10:22:00.465423 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:00 crc kubenswrapper[4765]: I0319 10:22:00.465440 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:00 crc kubenswrapper[4765]: I0319 10:22:00.466257 4765 scope.go:117] "RemoveContainer" containerID="61d28da17c510ae5b390edbe9ce51733566549bcd52f654f8852fa5bf3cb5fa5" Mar 19 10:22:00 crc kubenswrapper[4765]: E0319 10:22:00.466522 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 10:22:01 crc kubenswrapper[4765]: I0319 10:22:01.299117 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:01Z is after 2026-02-23T05:33:13Z Mar 19 10:22:01 crc kubenswrapper[4765]: I0319 10:22:01.466452 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:01 crc kubenswrapper[4765]: I0319 10:22:01.467459 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:01 crc kubenswrapper[4765]: I0319 10:22:01.467508 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:01 crc kubenswrapper[4765]: I0319 10:22:01.467525 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:01 crc kubenswrapper[4765]: I0319 10:22:01.468301 4765 scope.go:117] "RemoveContainer" containerID="61d28da17c510ae5b390edbe9ce51733566549bcd52f654f8852fa5bf3cb5fa5" Mar 19 10:22:01 crc kubenswrapper[4765]: E0319 10:22:01.468564 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 10:22:01 crc kubenswrapper[4765]: W0319 10:22:01.905414 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:01Z is after 2026-02-23T05:33:13Z Mar 19 10:22:01 crc kubenswrapper[4765]: E0319 10:22:01.905498 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:01Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:22:02 crc kubenswrapper[4765]: I0319 10:22:02.297703 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:02Z is after 2026-02-23T05:33:13Z Mar 19 10:22:02 crc kubenswrapper[4765]: E0319 10:22:02.420339 4765 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 10:22:02 crc kubenswrapper[4765]: I0319 10:22:02.596515 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:22:02 crc kubenswrapper[4765]: I0319 10:22:02.596744 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:02 crc 
kubenswrapper[4765]: I0319 10:22:02.597930 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:02 crc kubenswrapper[4765]: I0319 10:22:02.597973 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:02 crc kubenswrapper[4765]: I0319 10:22:02.597982 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:02 crc kubenswrapper[4765]: I0319 10:22:02.988697 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 19 10:22:02 crc kubenswrapper[4765]: I0319 10:22:02.988858 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:02 crc kubenswrapper[4765]: I0319 10:22:02.989721 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:02 crc kubenswrapper[4765]: I0319 10:22:02.989751 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:02 crc kubenswrapper[4765]: I0319 10:22:02.989762 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:03 crc kubenswrapper[4765]: I0319 10:22:03.010717 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 19 10:22:03 crc kubenswrapper[4765]: I0319 10:22:03.029876 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:03 crc kubenswrapper[4765]: I0319 10:22:03.031202 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:03 crc kubenswrapper[4765]: I0319 10:22:03.031239 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 10:22:03 crc kubenswrapper[4765]: I0319 10:22:03.031251 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:03 crc kubenswrapper[4765]: I0319 10:22:03.031271 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 10:22:03 crc kubenswrapper[4765]: E0319 10:22:03.033842 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:03Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 10:22:03 crc kubenswrapper[4765]: E0319 10:22:03.041522 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:03Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 10:22:03 crc kubenswrapper[4765]: W0319 10:22:03.226963 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:03Z is after 2026-02-23T05:33:13Z Mar 19 10:22:03 crc kubenswrapper[4765]: E0319 10:22:03.227081 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:03Z is after 
2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:22:03 crc kubenswrapper[4765]: I0319 10:22:03.299091 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:03Z is after 2026-02-23T05:33:13Z Mar 19 10:22:03 crc kubenswrapper[4765]: I0319 10:22:03.472229 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:03 crc kubenswrapper[4765]: I0319 10:22:03.473380 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:03 crc kubenswrapper[4765]: I0319 10:22:03.473443 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:03 crc kubenswrapper[4765]: I0319 10:22:03.473460 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:03 crc kubenswrapper[4765]: I0319 10:22:03.788072 4765 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:22:03 crc kubenswrapper[4765]: I0319 10:22:03.788167 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 
10:22:04 crc kubenswrapper[4765]: I0319 10:22:04.298329 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:04Z is after 2026-02-23T05:33:13Z Mar 19 10:22:05 crc kubenswrapper[4765]: I0319 10:22:05.014512 4765 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 10:22:05 crc kubenswrapper[4765]: E0319 10:22:05.018906 4765 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:05Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:22:05 crc kubenswrapper[4765]: I0319 10:22:05.300363 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:05Z is after 2026-02-23T05:33:13Z Mar 19 10:22:06 crc kubenswrapper[4765]: I0319 10:22:06.297808 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:06Z is after 2026-02-23T05:33:13Z Mar 19 10:22:06 crc kubenswrapper[4765]: E0319 10:22:06.629036 4765 event.go:368] "Unable to write 
event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:06Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e36eec0aebd50 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.293347664 +0000 UTC m=+0.642293236,LastTimestamp:2026-03-19 10:21:42.293347664 +0000 UTC m=+0.642293236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:07 crc kubenswrapper[4765]: I0319 10:22:07.297997 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:07Z is after 2026-02-23T05:33:13Z Mar 19 10:22:07 crc kubenswrapper[4765]: I0319 10:22:07.387008 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:22:07 crc kubenswrapper[4765]: I0319 10:22:07.387330 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:07 crc kubenswrapper[4765]: I0319 10:22:07.389061 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:07 crc kubenswrapper[4765]: I0319 10:22:07.389117 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:07 crc kubenswrapper[4765]: 
I0319 10:22:07.389137 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:07 crc kubenswrapper[4765]: I0319 10:22:07.389992 4765 scope.go:117] "RemoveContainer" containerID="61d28da17c510ae5b390edbe9ce51733566549bcd52f654f8852fa5bf3cb5fa5" Mar 19 10:22:07 crc kubenswrapper[4765]: E0319 10:22:07.390210 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 10:22:07 crc kubenswrapper[4765]: W0319 10:22:07.975761 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:07Z is after 2026-02-23T05:33:13Z Mar 19 10:22:07 crc kubenswrapper[4765]: E0319 10:22:07.975848 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:07Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:22:08 crc kubenswrapper[4765]: I0319 10:22:08.300711 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:08Z is after 2026-02-23T05:33:13Z Mar 19 10:22:08 crc kubenswrapper[4765]: W0319 10:22:08.812364 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:08Z is after 2026-02-23T05:33:13Z Mar 19 10:22:08 crc kubenswrapper[4765]: E0319 10:22:08.812516 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:08Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:22:09 crc kubenswrapper[4765]: I0319 10:22:09.298088 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:09Z is after 2026-02-23T05:33:13Z Mar 19 10:22:10 crc kubenswrapper[4765]: I0319 10:22:10.034768 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:10 crc kubenswrapper[4765]: I0319 10:22:10.036408 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:10 crc kubenswrapper[4765]: I0319 10:22:10.036466 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:10 crc kubenswrapper[4765]: I0319 10:22:10.036486 4765 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:10 crc kubenswrapper[4765]: I0319 10:22:10.036522 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 10:22:10 crc kubenswrapper[4765]: E0319 10:22:10.039767 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:10Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 10:22:10 crc kubenswrapper[4765]: E0319 10:22:10.046606 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:10Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 10:22:10 crc kubenswrapper[4765]: I0319 10:22:10.301688 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:10Z is after 2026-02-23T05:33:13Z Mar 19 10:22:11 crc kubenswrapper[4765]: I0319 10:22:11.298483 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:11Z is after 2026-02-23T05:33:13Z Mar 19 10:22:12 crc kubenswrapper[4765]: I0319 10:22:12.298047 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:12Z is after 2026-02-23T05:33:13Z Mar 19 10:22:12 crc kubenswrapper[4765]: E0319 10:22:12.420952 4765 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 10:22:12 crc kubenswrapper[4765]: W0319 10:22:12.626022 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:12Z is after 2026-02-23T05:33:13Z Mar 19 10:22:12 crc kubenswrapper[4765]: E0319 10:22:12.626108 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:12Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:22:13 crc kubenswrapper[4765]: I0319 10:22:13.298591 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:13Z is after 2026-02-23T05:33:13Z Mar 19 10:22:13 crc kubenswrapper[4765]: I0319 10:22:13.787053 4765 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get 
\"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:22:13 crc kubenswrapper[4765]: I0319 10:22:13.787144 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:22:13 crc kubenswrapper[4765]: I0319 10:22:13.787227 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:22:13 crc kubenswrapper[4765]: I0319 10:22:13.787428 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:13 crc kubenswrapper[4765]: I0319 10:22:13.788789 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:13 crc kubenswrapper[4765]: I0319 10:22:13.788836 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:13 crc kubenswrapper[4765]: I0319 10:22:13.788918 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:13 crc kubenswrapper[4765]: I0319 10:22:13.789643 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"0f8f655b4daf3b924b5dd3a003a3c3a3bbd03fd6d71a22dca1a0304d621da9c9"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 19 10:22:13 crc kubenswrapper[4765]: I0319 
10:22:13.789984 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://0f8f655b4daf3b924b5dd3a003a3c3a3bbd03fd6d71a22dca1a0304d621da9c9" gracePeriod=30 Mar 19 10:22:14 crc kubenswrapper[4765]: I0319 10:22:14.305858 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:14Z is after 2026-02-23T05:33:13Z Mar 19 10:22:14 crc kubenswrapper[4765]: I0319 10:22:14.503526 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 10:22:14 crc kubenswrapper[4765]: I0319 10:22:14.503924 4765 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="0f8f655b4daf3b924b5dd3a003a3c3a3bbd03fd6d71a22dca1a0304d621da9c9" exitCode=255 Mar 19 10:22:14 crc kubenswrapper[4765]: I0319 10:22:14.503991 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"0f8f655b4daf3b924b5dd3a003a3c3a3bbd03fd6d71a22dca1a0304d621da9c9"} Mar 19 10:22:14 crc kubenswrapper[4765]: I0319 10:22:14.504032 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"24f8fe40e3530732fc9f7682bdce072be0f8d05de0373e6c3a7157d00d4001f2"} Mar 19 10:22:14 crc kubenswrapper[4765]: I0319 10:22:14.504148 4765 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:14 crc kubenswrapper[4765]: I0319 10:22:14.505087 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:14 crc kubenswrapper[4765]: I0319 10:22:14.505115 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:14 crc kubenswrapper[4765]: I0319 10:22:14.505128 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:15 crc kubenswrapper[4765]: I0319 10:22:15.299022 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:15Z is after 2026-02-23T05:33:13Z Mar 19 10:22:16 crc kubenswrapper[4765]: I0319 10:22:16.297727 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:16Z is after 2026-02-23T05:33:13Z Mar 19 10:22:16 crc kubenswrapper[4765]: E0319 10:22:16.634126 4765 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:16Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e36eec0aebd50 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.293347664 +0000 UTC m=+0.642293236,LastTimestamp:2026-03-19 10:21:42.293347664 +0000 UTC m=+0.642293236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:17 crc kubenswrapper[4765]: I0319 10:22:17.040381 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:17 crc kubenswrapper[4765]: I0319 10:22:17.042091 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:17 crc kubenswrapper[4765]: I0319 10:22:17.042154 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:17 crc kubenswrapper[4765]: I0319 10:22:17.042166 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:17 crc kubenswrapper[4765]: I0319 10:22:17.042198 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 10:22:17 crc kubenswrapper[4765]: E0319 10:22:17.045243 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:17Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 10:22:17 crc kubenswrapper[4765]: E0319 10:22:17.050592 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet 
valid: current time 2026-03-19T10:22:17Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 10:22:17 crc kubenswrapper[4765]: I0319 10:22:17.300402 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:17Z is after 2026-02-23T05:33:13Z Mar 19 10:22:18 crc kubenswrapper[4765]: I0319 10:22:18.300577 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:18Z is after 2026-02-23T05:33:13Z Mar 19 10:22:19 crc kubenswrapper[4765]: I0319 10:22:19.298577 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:19Z is after 2026-02-23T05:33:13Z Mar 19 10:22:19 crc kubenswrapper[4765]: I0319 10:22:19.356060 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:19 crc kubenswrapper[4765]: I0319 10:22:19.357333 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:19 crc kubenswrapper[4765]: I0319 10:22:19.357390 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:19 crc kubenswrapper[4765]: I0319 10:22:19.357400 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:19 crc kubenswrapper[4765]: 
I0319 10:22:19.358195 4765 scope.go:117] "RemoveContainer" containerID="61d28da17c510ae5b390edbe9ce51733566549bcd52f654f8852fa5bf3cb5fa5" Mar 19 10:22:19 crc kubenswrapper[4765]: I0319 10:22:19.519198 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 10:22:19 crc kubenswrapper[4765]: I0319 10:22:19.522371 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a32f4be5c31af72605381eb407e6b3a9f7a38cc6ed772f797f431891435b1944"} Mar 19 10:22:19 crc kubenswrapper[4765]: I0319 10:22:19.522595 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:19 crc kubenswrapper[4765]: I0319 10:22:19.523558 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:19 crc kubenswrapper[4765]: I0319 10:22:19.523615 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:19 crc kubenswrapper[4765]: I0319 10:22:19.523627 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:19 crc kubenswrapper[4765]: I0319 10:22:19.837763 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:22:20 crc kubenswrapper[4765]: I0319 10:22:20.300415 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:20Z is after 2026-02-23T05:33:13Z Mar 19 10:22:20 crc kubenswrapper[4765]: 
I0319 10:22:20.526581 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 10:22:20 crc kubenswrapper[4765]: I0319 10:22:20.527673 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 19 10:22:20 crc kubenswrapper[4765]: I0319 10:22:20.529359 4765 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a32f4be5c31af72605381eb407e6b3a9f7a38cc6ed772f797f431891435b1944" exitCode=255 Mar 19 10:22:20 crc kubenswrapper[4765]: I0319 10:22:20.529407 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a32f4be5c31af72605381eb407e6b3a9f7a38cc6ed772f797f431891435b1944"} Mar 19 10:22:20 crc kubenswrapper[4765]: I0319 10:22:20.529452 4765 scope.go:117] "RemoveContainer" containerID="61d28da17c510ae5b390edbe9ce51733566549bcd52f654f8852fa5bf3cb5fa5" Mar 19 10:22:20 crc kubenswrapper[4765]: I0319 10:22:20.529645 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:20 crc kubenswrapper[4765]: I0319 10:22:20.530497 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:20 crc kubenswrapper[4765]: I0319 10:22:20.530533 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:20 crc kubenswrapper[4765]: I0319 10:22:20.530545 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:20 crc kubenswrapper[4765]: I0319 10:22:20.531144 4765 scope.go:117] "RemoveContainer" 
containerID="a32f4be5c31af72605381eb407e6b3a9f7a38cc6ed772f797f431891435b1944" Mar 19 10:22:20 crc kubenswrapper[4765]: E0319 10:22:20.531343 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 10:22:20 crc kubenswrapper[4765]: I0319 10:22:20.786554 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:22:20 crc kubenswrapper[4765]: I0319 10:22:20.787017 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:20 crc kubenswrapper[4765]: I0319 10:22:20.788304 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:20 crc kubenswrapper[4765]: I0319 10:22:20.788382 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:20 crc kubenswrapper[4765]: I0319 10:22:20.788400 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:21 crc kubenswrapper[4765]: I0319 10:22:21.075066 4765 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 10:22:21 crc kubenswrapper[4765]: E0319 10:22:21.079363 4765 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:21Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:22:21 crc kubenswrapper[4765]: E0319 10:22:21.080588 4765 certificate_manager.go:440] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Reached backoff limit, still unable to rotate certs: timed out waiting for the condition" logger="UnhandledError" Mar 19 10:22:21 crc kubenswrapper[4765]: I0319 10:22:21.177488 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:22:21 crc kubenswrapper[4765]: I0319 10:22:21.298356 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:21Z is after 2026-02-23T05:33:13Z Mar 19 10:22:21 crc kubenswrapper[4765]: I0319 10:22:21.534099 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 10:22:21 crc kubenswrapper[4765]: I0319 10:22:21.535759 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:21 crc kubenswrapper[4765]: I0319 10:22:21.536217 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:21 crc kubenswrapper[4765]: I0319 10:22:21.536706 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:21 crc kubenswrapper[4765]: I0319 10:22:21.536771 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:21 crc kubenswrapper[4765]: I0319 
10:22:21.536791 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:21 crc kubenswrapper[4765]: I0319 10:22:21.537690 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:21 crc kubenswrapper[4765]: I0319 10:22:21.537747 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:21 crc kubenswrapper[4765]: I0319 10:22:21.537765 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:21 crc kubenswrapper[4765]: I0319 10:22:21.538487 4765 scope.go:117] "RemoveContainer" containerID="a32f4be5c31af72605381eb407e6b3a9f7a38cc6ed772f797f431891435b1944" Mar 19 10:22:21 crc kubenswrapper[4765]: E0319 10:22:21.538761 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 10:22:22 crc kubenswrapper[4765]: I0319 10:22:22.299266 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:22Z is after 2026-02-23T05:33:13Z Mar 19 10:22:22 crc kubenswrapper[4765]: E0319 10:22:22.421192 4765 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 10:22:23 crc kubenswrapper[4765]: I0319 10:22:23.297767 4765 csi_plugin.go:884] Failed to contact API server when 
waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:23Z is after 2026-02-23T05:33:13Z Mar 19 10:22:23 crc kubenswrapper[4765]: I0319 10:22:23.787222 4765 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:22:23 crc kubenswrapper[4765]: I0319 10:22:23.787318 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:22:23 crc kubenswrapper[4765]: W0319 10:22:23.950571 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:23Z is after 2026-02-23T05:33:13Z Mar 19 10:22:23 crc kubenswrapper[4765]: E0319 10:22:23.950671 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-19T10:22:23Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:22:24 crc kubenswrapper[4765]: I0319 10:22:24.046408 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:24 crc kubenswrapper[4765]: I0319 10:22:24.047897 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:24 crc kubenswrapper[4765]: I0319 10:22:24.048006 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:24 crc kubenswrapper[4765]: I0319 10:22:24.048021 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:24 crc kubenswrapper[4765]: I0319 10:22:24.048049 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 10:22:24 crc kubenswrapper[4765]: E0319 10:22:24.051254 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:24Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 10:22:24 crc kubenswrapper[4765]: E0319 10:22:24.054925 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:24Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 10:22:24 crc kubenswrapper[4765]: I0319 10:22:24.298470 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-03-19T10:22:24Z is after 2026-02-23T05:33:13Z Mar 19 10:22:25 crc kubenswrapper[4765]: I0319 10:22:25.298996 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:25Z is after 2026-02-23T05:33:13Z Mar 19 10:22:26 crc kubenswrapper[4765]: W0319 10:22:26.006703 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:26Z is after 2026-02-23T05:33:13Z Mar 19 10:22:26 crc kubenswrapper[4765]: E0319 10:22:26.006855 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:26Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:22:26 crc kubenswrapper[4765]: I0319 10:22:26.298842 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:26Z is after 2026-02-23T05:33:13Z Mar 19 10:22:26 crc kubenswrapper[4765]: E0319 10:22:26.640481 4765 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:26Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e36eec0aebd50 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.293347664 +0000 UTC m=+0.642293236,LastTimestamp:2026-03-19 10:21:42.293347664 +0000 UTC m=+0.642293236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:27 crc kubenswrapper[4765]: W0319 10:22:27.160331 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:27Z is after 2026-02-23T05:33:13Z Mar 19 10:22:27 crc kubenswrapper[4765]: E0319 10:22:27.160429 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:27Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:22:27 crc kubenswrapper[4765]: I0319 10:22:27.298727 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:27Z is after 2026-02-23T05:33:13Z Mar 19 10:22:27 crc kubenswrapper[4765]: I0319 10:22:27.387032 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:22:27 crc kubenswrapper[4765]: I0319 10:22:27.387296 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:27 crc kubenswrapper[4765]: I0319 10:22:27.388671 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:27 crc kubenswrapper[4765]: I0319 10:22:27.388711 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:27 crc kubenswrapper[4765]: I0319 10:22:27.388723 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:27 crc kubenswrapper[4765]: I0319 10:22:27.389407 4765 scope.go:117] "RemoveContainer" containerID="a32f4be5c31af72605381eb407e6b3a9f7a38cc6ed772f797f431891435b1944" Mar 19 10:22:27 crc kubenswrapper[4765]: E0319 10:22:27.389629 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 10:22:28 crc kubenswrapper[4765]: I0319 10:22:28.299162 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-19T10:22:28Z is after 2026-02-23T05:33:13Z Mar 19 10:22:29 crc kubenswrapper[4765]: I0319 10:22:29.298765 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:29Z is after 2026-02-23T05:33:13Z Mar 19 10:22:29 crc kubenswrapper[4765]: W0319 10:22:29.829588 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:29Z is after 2026-02-23T05:33:13Z Mar 19 10:22:29 crc kubenswrapper[4765]: E0319 10:22:29.829700 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 19 10:22:30 crc kubenswrapper[4765]: I0319 10:22:30.298603 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:30Z is after 2026-02-23T05:33:13Z Mar 19 10:22:31 crc kubenswrapper[4765]: I0319 10:22:31.053351 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:31 crc kubenswrapper[4765]: I0319 10:22:31.054691 4765 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:31 crc kubenswrapper[4765]: I0319 10:22:31.054758 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:31 crc kubenswrapper[4765]: I0319 10:22:31.054774 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:31 crc kubenswrapper[4765]: I0319 10:22:31.054810 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 10:22:31 crc kubenswrapper[4765]: E0319 10:22:31.057653 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:31Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 10:22:31 crc kubenswrapper[4765]: E0319 10:22:31.059247 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:31Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 10:22:31 crc kubenswrapper[4765]: I0319 10:22:31.300708 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:31Z is after 2026-02-23T05:33:13Z Mar 19 10:22:32 crc kubenswrapper[4765]: I0319 10:22:32.248243 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 19 10:22:32 crc 
kubenswrapper[4765]: I0319 10:22:32.248424 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:32 crc kubenswrapper[4765]: I0319 10:22:32.249643 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:32 crc kubenswrapper[4765]: I0319 10:22:32.249688 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:32 crc kubenswrapper[4765]: I0319 10:22:32.249700 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:32 crc kubenswrapper[4765]: I0319 10:22:32.300593 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:32Z is after 2026-02-23T05:33:13Z Mar 19 10:22:32 crc kubenswrapper[4765]: E0319 10:22:32.421888 4765 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 10:22:33 crc kubenswrapper[4765]: I0319 10:22:33.298411 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:33Z is after 2026-02-23T05:33:13Z Mar 19 10:22:33 crc kubenswrapper[4765]: I0319 10:22:33.786680 4765 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for 
connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:22:33 crc kubenswrapper[4765]: I0319 10:22:33.786775 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:22:34 crc kubenswrapper[4765]: I0319 10:22:34.299301 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:34Z is after 2026-02-23T05:33:13Z Mar 19 10:22:35 crc kubenswrapper[4765]: I0319 10:22:35.298289 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:35Z is after 2026-02-23T05:33:13Z Mar 19 10:22:36 crc kubenswrapper[4765]: I0319 10:22:36.299277 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:36Z is after 2026-02-23T05:33:13Z Mar 19 10:22:36 crc kubenswrapper[4765]: E0319 10:22:36.645030 4765 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired 
or is not yet valid: current time 2026-03-19T10:22:36Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e36eec0aebd50 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.293347664 +0000 UTC m=+0.642293236,LastTimestamp:2026-03-19 10:21:42.293347664 +0000 UTC m=+0.642293236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:37 crc kubenswrapper[4765]: I0319 10:22:37.299362 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:37Z is after 2026-02-23T05:33:13Z Mar 19 10:22:38 crc kubenswrapper[4765]: I0319 10:22:38.058239 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:38 crc kubenswrapper[4765]: I0319 10:22:38.060638 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:38 crc kubenswrapper[4765]: I0319 10:22:38.060711 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:38 crc kubenswrapper[4765]: I0319 10:22:38.060731 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:38 crc kubenswrapper[4765]: I0319 10:22:38.060777 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 10:22:38 crc kubenswrapper[4765]: E0319 10:22:38.064093 4765 kubelet_node_status.go:99] "Unable to 
register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:38Z is after 2026-02-23T05:33:13Z" node="crc" Mar 19 10:22:38 crc kubenswrapper[4765]: E0319 10:22:38.066910 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:38Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 19 10:22:38 crc kubenswrapper[4765]: I0319 10:22:38.300091 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:38Z is after 2026-02-23T05:33:13Z Mar 19 10:22:39 crc kubenswrapper[4765]: I0319 10:22:39.301640 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:39Z is after 2026-02-23T05:33:13Z Mar 19 10:22:40 crc kubenswrapper[4765]: I0319 10:22:40.301937 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:40Z is after 2026-02-23T05:33:13Z Mar 19 10:22:40 crc kubenswrapper[4765]: I0319 10:22:40.356067 4765 kubelet_node_status.go:401] "Setting node annotation to enable 
volume controller attach/detach" Mar 19 10:22:40 crc kubenswrapper[4765]: I0319 10:22:40.357736 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:40 crc kubenswrapper[4765]: I0319 10:22:40.357786 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:40 crc kubenswrapper[4765]: I0319 10:22:40.357805 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:40 crc kubenswrapper[4765]: I0319 10:22:40.358819 4765 scope.go:117] "RemoveContainer" containerID="a32f4be5c31af72605381eb407e6b3a9f7a38cc6ed772f797f431891435b1944" Mar 19 10:22:40 crc kubenswrapper[4765]: E0319 10:22:40.359148 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 10:22:41 crc kubenswrapper[4765]: I0319 10:22:41.299427 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:22:41Z is after 2026-02-23T05:33:13Z Mar 19 10:22:42 crc kubenswrapper[4765]: I0319 10:22:42.299102 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 10:22:42 crc kubenswrapper[4765]: E0319 10:22:42.423226 4765 eviction_manager.go:285] "Eviction 
manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 10:22:43 crc kubenswrapper[4765]: I0319 10:22:43.300656 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 10:22:43 crc kubenswrapper[4765]: I0319 10:22:43.787431 4765 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:22:43 crc kubenswrapper[4765]: I0319 10:22:43.787547 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:22:43 crc kubenswrapper[4765]: I0319 10:22:43.787648 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:22:43 crc kubenswrapper[4765]: I0319 10:22:43.787880 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:43 crc kubenswrapper[4765]: I0319 10:22:43.789833 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:43 crc kubenswrapper[4765]: I0319 10:22:43.789890 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:43 
crc kubenswrapper[4765]: I0319 10:22:43.789911 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:43 crc kubenswrapper[4765]: I0319 10:22:43.790753 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"24f8fe40e3530732fc9f7682bdce072be0f8d05de0373e6c3a7157d00d4001f2"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 19 10:22:43 crc kubenswrapper[4765]: I0319 10:22:43.790946 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://24f8fe40e3530732fc9f7682bdce072be0f8d05de0373e6c3a7157d00d4001f2" gracePeriod=30 Mar 19 10:22:44 crc kubenswrapper[4765]: I0319 10:22:44.302095 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 10:22:44 crc kubenswrapper[4765]: I0319 10:22:44.596695 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 19 10:22:44 crc kubenswrapper[4765]: I0319 10:22:44.598160 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 19 10:22:44 crc kubenswrapper[4765]: I0319 10:22:44.598552 4765 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" 
containerID="24f8fe40e3530732fc9f7682bdce072be0f8d05de0373e6c3a7157d00d4001f2" exitCode=255 Mar 19 10:22:44 crc kubenswrapper[4765]: I0319 10:22:44.598607 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"24f8fe40e3530732fc9f7682bdce072be0f8d05de0373e6c3a7157d00d4001f2"} Mar 19 10:22:44 crc kubenswrapper[4765]: I0319 10:22:44.598675 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"521a28023428a22264f3da9e2eb80eb1726c4ce7fc6c3bedcc4aa4fece1dce52"} Mar 19 10:22:44 crc kubenswrapper[4765]: I0319 10:22:44.598701 4765 scope.go:117] "RemoveContainer" containerID="0f8f655b4daf3b924b5dd3a003a3c3a3bbd03fd6d71a22dca1a0304d621da9c9" Mar 19 10:22:44 crc kubenswrapper[4765]: I0319 10:22:44.598910 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:44 crc kubenswrapper[4765]: I0319 10:22:44.600728 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:44 crc kubenswrapper[4765]: I0319 10:22:44.600794 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:44 crc kubenswrapper[4765]: I0319 10:22:44.600824 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:45 crc kubenswrapper[4765]: I0319 10:22:45.066365 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:45 crc kubenswrapper[4765]: I0319 10:22:45.073205 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:45 crc kubenswrapper[4765]: 
I0319 10:22:45.073244 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:45 crc kubenswrapper[4765]: I0319 10:22:45.073256 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:45 crc kubenswrapper[4765]: I0319 10:22:45.073285 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 10:22:45 crc kubenswrapper[4765]: E0319 10:22:45.075106 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 10:22:45 crc kubenswrapper[4765]: E0319 10:22:45.082246 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 10:22:45 crc kubenswrapper[4765]: I0319 10:22:45.302587 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 10:22:45 crc kubenswrapper[4765]: I0319 10:22:45.603557 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 19 10:22:46 crc kubenswrapper[4765]: I0319 10:22:46.299629 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.649837 4765 event.go:359] "Server 
rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec0aebd50 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.293347664 +0000 UTC m=+0.642293236,LastTimestamp:2026-03-19 10:21:42.293347664 +0000 UTC m=+0.642293236,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.654663 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38c8976 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341437814 +0000 UTC m=+0.690383386,LastTimestamp:2026-03-19 10:21:42.341437814 +0000 UTC m=+0.690383386,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.658515 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38d1d29 default 0 0001-01-01 00:00:00 +0000 
UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341475625 +0000 UTC m=+0.690421197,LastTimestamp:2026-03-19 10:21:42.341475625 +0000 UTC m=+0.690421197,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.662093 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38d6a09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341495305 +0000 UTC m=+0.690440877,LastTimestamp:2026-03-19 10:21:42.341495305 +0000 UTC m=+0.690440877,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.667355 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec7f9c774 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across 
pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.415705972 +0000 UTC m=+0.764651514,LastTimestamp:2026-03-19 10:21:42.415705972 +0000 UTC m=+0.764651514,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.672470 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38c8976\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38c8976 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341437814 +0000 UTC m=+0.690383386,LastTimestamp:2026-03-19 10:21:42.456495958 +0000 UTC m=+0.805441500,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.676133 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38d1d29\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38d1d29 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341475625 +0000 UTC m=+0.690421197,LastTimestamp:2026-03-19 
10:21:42.456513218 +0000 UTC m=+0.805458760,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.680013 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38d6a09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38d6a09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341495305 +0000 UTC m=+0.690440877,LastTimestamp:2026-03-19 10:21:42.456525328 +0000 UTC m=+0.805470870,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.683679 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38c8976\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38c8976 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341437814 +0000 UTC m=+0.690383386,LastTimestamp:2026-03-19 10:21:42.457386982 +0000 UTC m=+0.806332534,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.687326 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38d1d29\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38d1d29 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341475625 +0000 UTC m=+0.690421197,LastTimestamp:2026-03-19 10:21:42.457401383 +0000 UTC m=+0.806346935,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.690663 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38d6a09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38d6a09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341495305 +0000 UTC m=+0.690440877,LastTimestamp:2026-03-19 10:21:42.457411743 +0000 UTC m=+0.806357295,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.694792 4765 event.go:359] 
"Server rejected event (will not retry!)" err="events \"crc.189e36eec38c8976\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38c8976 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341437814 +0000 UTC m=+0.690383386,LastTimestamp:2026-03-19 10:21:42.457529636 +0000 UTC m=+0.806475178,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.699492 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38d1d29\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38d1d29 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341475625 +0000 UTC m=+0.690421197,LastTimestamp:2026-03-19 10:21:42.457610649 +0000 UTC m=+0.806556191,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.706876 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38d6a09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API 
group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38d6a09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341495305 +0000 UTC m=+0.690440877,LastTimestamp:2026-03-19 10:21:42.457620359 +0000 UTC m=+0.806565901,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.713529 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38c8976\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38c8976 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341437814 +0000 UTC m=+0.690383386,LastTimestamp:2026-03-19 10:21:42.459066539 +0000 UTC m=+0.808012091,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.718072 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38d1d29\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38d1d29 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341475625 +0000 UTC m=+0.690421197,LastTimestamp:2026-03-19 10:21:42.45909091 +0000 UTC m=+0.808036462,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.723203 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38d6a09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38d6a09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341495305 +0000 UTC m=+0.690440877,LastTimestamp:2026-03-19 10:21:42.459120771 +0000 UTC m=+0.808066333,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.727264 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38c8976\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38c8976 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status 
is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341437814 +0000 UTC m=+0.690383386,LastTimestamp:2026-03-19 10:21:42.459289985 +0000 UTC m=+0.808235527,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.731940 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38d1d29\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38d1d29 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341475625 +0000 UTC m=+0.690421197,LastTimestamp:2026-03-19 10:21:42.459317556 +0000 UTC m=+0.808263098,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.735641 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38d6a09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38d6a09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341495305 +0000 UTC 
m=+0.690440877,LastTimestamp:2026-03-19 10:21:42.459328307 +0000 UTC m=+0.808273849,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.739937 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38c8976\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38c8976 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341437814 +0000 UTC m=+0.690383386,LastTimestamp:2026-03-19 10:21:42.460641233 +0000 UTC m=+0.809586775,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.744087 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38d1d29\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38d1d29 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341475625 +0000 UTC m=+0.690421197,LastTimestamp:2026-03-19 10:21:42.460658214 +0000 UTC m=+0.809603756,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.748868 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38d6a09\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38d6a09 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341495305 +0000 UTC m=+0.690440877,LastTimestamp:2026-03-19 10:21:42.460669704 +0000 UTC m=+0.809615246,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.753884 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38c8976\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38c8976 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341437814 +0000 UTC m=+0.690383386,LastTimestamp:2026-03-19 10:21:42.461370803 +0000 UTC m=+0.810316355,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.758135 4765 
event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e36eec38d1d29\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e36eec38d1d29 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.341475625 +0000 UTC m=+0.690421197,LastTimestamp:2026-03-19 10:21:42.461385664 +0000 UTC m=+0.810331206,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.763686 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e36eee15571d8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.841143768 +0000 UTC m=+1.190089310,LastTimestamp:2026-03-19 10:21:42.841143768 +0000 UTC m=+1.190089310,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.769813 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36eee1e5b66d openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.850598509 +0000 UTC m=+1.199544061,LastTimestamp:2026-03-19 10:21:42.850598509 +0000 UTC m=+1.199544061,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.775794 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e36eee291e85e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" 
already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.861883486 +0000 UTC m=+1.210829028,LastTimestamp:2026-03-19 10:21:42.861883486 +0000 UTC m=+1.210829028,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.782243 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36eee293e4f4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.862013684 +0000 UTC m=+1.210959226,LastTimestamp:2026-03-19 10:21:42.862013684 +0000 UTC m=+1.210959226,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.786045 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36eee301e3ac openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:42.869222316 +0000 UTC m=+1.218167858,LastTimestamp:2026-03-19 10:21:42.869222316 +0000 UTC m=+1.218167858,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.790925 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36ef04ebed14 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.438208276 +0000 UTC m=+1.787153818,LastTimestamp:2026-03-19 10:21:43.438208276 +0000 UTC m=+1.787153818,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.795592 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User 
\"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e36ef04ed40cd openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.438295245 +0000 UTC m=+1.787240787,LastTimestamp:2026-03-19 10:21:43.438295245 +0000 UTC m=+1.787240787,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.799751 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e36ef04f244e2 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.43862397 +0000 UTC m=+1.787569512,LastTimestamp:2026-03-19 10:21:43.43862397 +0000 UTC m=+1.787569512,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.803901 4765 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36ef04f24708 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.43862452 +0000 UTC m=+1.787570052,LastTimestamp:2026-03-19 10:21:43.43862452 +0000 UTC m=+1.787570052,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.809084 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef04f9388c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.439079564 +0000 UTC m=+1.788025106,LastTimestamp:2026-03-19 10:21:43.439079564 +0000 UTC m=+1.788025106,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.813062 4765 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e36ef05e64243 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.454614083 +0000 UTC m=+1.803559625,LastTimestamp:2026-03-19 10:21:43.454614083 +0000 UTC m=+1.803559625,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.816499 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef067d877a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.464527738 +0000 UTC m=+1.813473280,LastTimestamp:2026-03-19 10:21:43.464527738 +0000 UTC m=+1.813473280,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.819859 4765 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e36ef0681ddb3 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.464811955 +0000 UTC m=+1.813757497,LastTimestamp:2026-03-19 10:21:43.464811955 +0000 UTC m=+1.813757497,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.824128 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36ef0681ec94 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.464815764 +0000 UTC m=+1.813761306,LastTimestamp:2026-03-19 10:21:43.464815764 +0000 UTC m=+1.813761306,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.828358 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36ef06893b24 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.465294628 +0000 UTC m=+1.814240190,LastTimestamp:2026-03-19 10:21:43.465294628 +0000 UTC m=+1.814240190,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.832349 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36ef0698b924 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 
10:21:43.466309924 +0000 UTC m=+1.815255466,LastTimestamp:2026-03-19 10:21:43.466309924 +0000 UTC m=+1.815255466,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.837340 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36ef173b7b16 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.745411862 +0000 UTC m=+2.094357404,LastTimestamp:2026-03-19 10:21:43.745411862 +0000 UTC m=+2.094357404,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.841433 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36ef17f3c4c4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.757489348 +0000 UTC m=+2.106434890,LastTimestamp:2026-03-19 10:21:43.757489348 +0000 UTC m=+2.106434890,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.846892 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36ef180aa388 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.758988168 +0000 UTC m=+2.107933720,LastTimestamp:2026-03-19 10:21:43.758988168 +0000 UTC m=+2.107933720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.851462 4765 event.go:359] "Server rejected event (will not 
retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36ef22c6d367 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.939093351 +0000 UTC m=+2.288038893,LastTimestamp:2026-03-19 10:21:43.939093351 +0000 UTC m=+2.288038893,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.855708 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36ef236db9f8 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.950031352 +0000 UTC m=+2.298976894,LastTimestamp:2026-03-19 10:21:43.950031352 +0000 UTC m=+2.298976894,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.860591 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36ef237ffd38 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.951228216 +0000 UTC m=+2.300173758,LastTimestamp:2026-03-19 10:21:43.951228216 +0000 UTC m=+2.300173758,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.865789 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36ef2df5d144 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.126722372 +0000 UTC m=+2.475667924,LastTimestamp:2026-03-19 10:21:44.126722372 +0000 UTC m=+2.475667924,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.872194 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36ef2ed95dd5 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.141635029 +0000 UTC m=+2.490580561,LastTimestamp:2026-03-19 10:21:44.141635029 +0000 UTC m=+2.490580561,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.877038 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e36ef3ca7fcbc openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.373279932 +0000 UTC m=+2.722225494,LastTimestamp:2026-03-19 10:21:44.373279932 +0000 UTC m=+2.722225494,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.880667 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36ef3cb1fabf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.373934783 +0000 UTC m=+2.722880335,LastTimestamp:2026-03-19 10:21:44.373934783 +0000 UTC m=+2.722880335,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.888236 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e36ef3d2ef078 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.382124152 +0000 UTC m=+2.731069704,LastTimestamp:2026-03-19 10:21:44.382124152 +0000 UTC m=+2.731069704,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.895669 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef3d903db3 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.388500915 +0000 UTC m=+2.737446497,LastTimestamp:2026-03-19 10:21:44.388500915 +0000 UTC m=+2.737446497,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.911612 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e36ef4b1fe2d8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.616018648 +0000 UTC m=+2.964964190,LastTimestamp:2026-03-19 10:21:44.616018648 +0000 UTC m=+2.964964190,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.916947 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36ef4b987099 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.623919257 +0000 UTC m=+2.972864799,LastTimestamp:2026-03-19 10:21:44.623919257 +0000 UTC m=+2.972864799,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.921517 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e36ef4b989d5c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.623930716 +0000 UTC m=+2.972876258,LastTimestamp:2026-03-19 10:21:44.623930716 +0000 UTC m=+2.972876258,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.927303 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef4c1bb8c5 openshift-kube-apiserver 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.632522949 +0000 UTC m=+2.981468491,LastTimestamp:2026-03-19 10:21:44.632522949 +0000 UTC m=+2.981468491,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.931137 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e36ef4c28e6c8 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.633386696 +0000 UTC m=+2.982332238,LastTimestamp:2026-03-19 10:21:44.633386696 +0000 UTC m=+2.982332238,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.933481 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" 
event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e36ef4c4054d5 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.634922197 +0000 UTC m=+2.983867739,LastTimestamp:2026-03-19 10:21:44.634922197 +0000 UTC m=+2.983867739,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.934737 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e36ef4c59ed01 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.636599553 +0000 UTC m=+2.985545095,LastTimestamp:2026-03-19 10:21:44.636599553 +0000 UTC m=+2.985545095,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.938394 
4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36ef4c9f59ec openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.64114942 +0000 UTC m=+2.990094962,LastTimestamp:2026-03-19 10:21:44.64114942 +0000 UTC m=+2.990094962,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.943863 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef4db27064 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.659177572 +0000 UTC m=+3.008123114,LastTimestamp:2026-03-19 10:21:44.659177572 +0000 UTC m=+3.008123114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.948894 4765 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef4dc06da9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.660094377 +0000 UTC m=+3.009039919,LastTimestamp:2026-03-19 10:21:44.660094377 +0000 UTC m=+3.009039919,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.953704 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e36ef563cd93c openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.802466108 +0000 UTC m=+3.151411650,LastTimestamp:2026-03-19 10:21:44.802466108 +0000 
UTC m=+3.151411650,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.957424 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e36ef56c81cb0 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.81159288 +0000 UTC m=+3.160538422,LastTimestamp:2026-03-19 10:21:44.81159288 +0000 UTC m=+3.160538422,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.962891 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e36ef56d5e116 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.812495126 +0000 UTC m=+3.161440668,LastTimestamp:2026-03-19 10:21:44.812495126 +0000 UTC m=+3.161440668,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.967579 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef58175b28 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.833563432 +0000 UTC m=+3.182508974,LastTimestamp:2026-03-19 10:21:44.833563432 +0000 UTC m=+3.182508974,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.971985 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef58f71629 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.848225833 +0000 UTC m=+3.197171375,LastTimestamp:2026-03-19 10:21:44.848225833 +0000 UTC m=+3.197171375,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.976557 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef590ca7ab openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:44.849639339 +0000 UTC m=+3.198584871,LastTimestamp:2026-03-19 10:21:44.849639339 +0000 UTC m=+3.198584871,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.981603 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot 
create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e36ef6231914d openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:45.003053389 +0000 UTC m=+3.351998931,LastTimestamp:2026-03-19 10:21:45.003053389 +0000 UTC m=+3.351998931,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.985308 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e36ef62e9dcb6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:45.015131318 +0000 UTC m=+3.364076860,LastTimestamp:2026-03-19 10:21:45.015131318 +0000 UTC m=+3.364076860,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc 
kubenswrapper[4765]: E0319 10:22:46.989694 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef62ed5bea openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:45.01536049 +0000 UTC m=+3.364306032,LastTimestamp:2026-03-19 10:21:45.01536049 +0000 UTC m=+3.364306032,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.994309 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef64c38af7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:45.046174455 +0000 UTC m=+3.395119987,LastTimestamp:2026-03-19 10:21:45.046174455 +0000 UTC 
m=+3.395119987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:46 crc kubenswrapper[4765]: E0319 10:22:46.998472 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef64d4f1c1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:45.047314881 +0000 UTC m=+3.396260443,LastTimestamp:2026-03-19 10:21:45.047314881 +0000 UTC m=+3.396260443,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.003629 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef6f428258 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:45.22226748 +0000 UTC m=+3.571213012,LastTimestamp:2026-03-19 10:21:45.22226748 +0000 UTC m=+3.571213012,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.007347 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef700a763c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:45.23537158 +0000 UTC m=+3.584317122,LastTimestamp:2026-03-19 10:21:45.23537158 +0000 UTC m=+3.584317122,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.011650 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" 
event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef701c6e23 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:45.236549155 +0000 UTC m=+3.585494697,LastTimestamp:2026-03-19 10:21:45.236549155 +0000 UTC m=+3.585494697,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.016320 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36ef795637f1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:45.391331313 +0000 UTC m=+3.740276855,LastTimestamp:2026-03-19 10:21:45.391331313 +0000 UTC m=+3.740276855,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.022460 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef7c0b5438 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:45.436755 +0000 UTC m=+3.785700542,LastTimestamp:2026-03-19 10:21:45.436755 +0000 UTC m=+3.785700542,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.027613 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef7c9ee100 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:45.446424832 +0000 UTC m=+3.795370374,LastTimestamp:2026-03-19 
10:21:45.446424832 +0000 UTC m=+3.795370374,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.031630 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36ef84deaac8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:45.584822984 +0000 UTC m=+3.933768516,LastTimestamp:2026-03-19 10:21:45.584822984 +0000 UTC m=+3.933768516,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.039375 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36ef85b5c56e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:45.598920046 +0000 UTC m=+3.947865588,LastTimestamp:2026-03-19 10:21:45.598920046 +0000 UTC 
m=+3.947865588,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.045504 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36efb57bab2a openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:46.400418602 +0000 UTC m=+4.749364144,LastTimestamp:2026-03-19 10:21:46.400418602 +0000 UTC m=+4.749364144,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.050041 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36efbfa5b3b3 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:46.570945459 +0000 UTC 
m=+4.919890991,LastTimestamp:2026-03-19 10:21:46.570945459 +0000 UTC m=+4.919890991,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.053872 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36efc02224eb openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:46.579100907 +0000 UTC m=+4.928046489,LastTimestamp:2026-03-19 10:21:46.579100907 +0000 UTC m=+4.928046489,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.057813 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36efc0320f24 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:46.580143908 +0000 UTC m=+4.929089450,LastTimestamp:2026-03-19 10:21:46.580143908 +0000 UTC m=+4.929089450,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.061650 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36efc9d0c01b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:46.741538843 +0000 UTC m=+5.090484395,LastTimestamp:2026-03-19 10:21:46.741538843 +0000 UTC m=+5.090484395,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.066599 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36efca78de09 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:46.752556553 +0000 UTC 
m=+5.101502115,LastTimestamp:2026-03-19 10:21:46.752556553 +0000 UTC m=+5.101502115,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.071184 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36efca88025f openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:46.753548895 +0000 UTC m=+5.102494437,LastTimestamp:2026-03-19 10:21:46.753548895 +0000 UTC m=+5.102494437,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.074690 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36efd3c8abd6 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container 
etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:46.908781526 +0000 UTC m=+5.257727088,LastTimestamp:2026-03-19 10:21:46.908781526 +0000 UTC m=+5.257727088,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.078650 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36efd4be846b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:46.924893291 +0000 UTC m=+5.273838823,LastTimestamp:2026-03-19 10:21:46.924893291 +0000 UTC m=+5.273838823,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.081886 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36efd4d15b3e openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:46.926127934 +0000 UTC m=+5.275073486,LastTimestamp:2026-03-19 10:21:46.926127934 +0000 UTC m=+5.275073486,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.085221 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36efdf11c355 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:47.098121045 +0000 UTC m=+5.447066627,LastTimestamp:2026-03-19 10:21:47.098121045 +0000 UTC m=+5.447066627,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.088703 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36efdfc9ac96 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:47.110173846 +0000 UTC m=+5.459119398,LastTimestamp:2026-03-19 10:21:47.110173846 +0000 UTC m=+5.459119398,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.092444 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36efdfda0879 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:47.111245945 +0000 UTC m=+5.460191517,LastTimestamp:2026-03-19 10:21:47.111245945 +0000 UTC m=+5.460191517,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.095897 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36efed1db6e7 openshift-etcd 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:47.333785319 +0000 UTC m=+5.682730901,LastTimestamp:2026-03-19 10:21:47.333785319 +0000 UTC m=+5.682730901,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.100367 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e36efedd467d2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:47.345758162 +0000 UTC m=+5.694703714,LastTimestamp:2026-03-19 10:21:47.345758162 +0000 UTC m=+5.694703714,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.107991 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 10:22:47 crc kubenswrapper[4765]: &Event{ObjectMeta:{kube-controller-manager-crc.189e36f16dc766b6 
openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 19 10:22:47 crc kubenswrapper[4765]: body: Mar 19 10:22:47 crc kubenswrapper[4765]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:53.787356854 +0000 UTC m=+12.136302436,LastTimestamp:2026-03-19 10:21:53.787356854 +0000 UTC m=+12.136302436,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 10:22:47 crc kubenswrapper[4765]: > Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.113786 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36f16dc89740 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:53.787434816 +0000 UTC m=+12.136380398,LastTimestamp:2026-03-19 10:21:53.787434816 +0000 UTC 
m=+12.136380398,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.118002 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 10:22:47 crc kubenswrapper[4765]: &Event{ObjectMeta:{kube-apiserver-crc.189e36f2172bcea7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 10:22:47 crc kubenswrapper[4765]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 10:22:47 crc kubenswrapper[4765]: Mar 19 10:22:47 crc kubenswrapper[4765]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:56.629286567 +0000 UTC m=+14.978232129,LastTimestamp:2026-03-19 10:21:56.629286567 +0000 UTC m=+14.978232129,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 10:22:47 crc kubenswrapper[4765]: > Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.121800 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36f2172c9540 openshift-kube-apiserver 0 0001-01-01 00:00:00 
+0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:56.629337408 +0000 UTC m=+14.978282960,LastTimestamp:2026-03-19 10:21:56.629337408 +0000 UTC m=+14.978282960,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.125581 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e36f2172bcea7\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 10:22:47 crc kubenswrapper[4765]: &Event{ObjectMeta:{kube-apiserver-crc.189e36f2172bcea7 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 19 10:22:47 crc kubenswrapper[4765]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 19 10:22:47 crc kubenswrapper[4765]: Mar 19 10:22:47 crc kubenswrapper[4765]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:56.629286567 +0000 UTC m=+14.978232129,LastTimestamp:2026-03-19 10:21:56.635584446 +0000 UTC 
m=+14.984530038,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 10:22:47 crc kubenswrapper[4765]: > Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.130062 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e36f2172c9540\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36f2172c9540 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:56.629337408 +0000 UTC m=+14.978282960,LastTimestamp:2026-03-19 10:21:56.635652137 +0000 UTC m=+14.984597719,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.134903 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 19 10:22:47 crc kubenswrapper[4765]: &Event{ObjectMeta:{kube-apiserver-crc.189e36f2445c6708 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Liveness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 19 10:22:47 crc kubenswrapper[4765]: body: Mar 19 10:22:47 crc kubenswrapper[4765]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:57.387446024 +0000 UTC m=+15.736391566,LastTimestamp:2026-03-19 10:21:57.387446024 +0000 UTC m=+15.736391566,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 10:22:47 crc kubenswrapper[4765]: > Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.138683 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36f2445cfbb5 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Liveness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:57.387484085 +0000 UTC m=+15.736429627,LastTimestamp:2026-03-19 10:21:57.387484085 +0000 UTC m=+15.736429627,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc 
kubenswrapper[4765]: E0319 10:22:47.144507 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e36ef701c6e23\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e36ef701c6e23 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:45.236549155 +0000 UTC m=+3.585494697,LastTimestamp:2026-03-19 10:21:57.443237852 +0000 UTC m=+15.792183424,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.150586 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 10:22:47 crc kubenswrapper[4765]: &Event{ObjectMeta:{kube-controller-manager-crc.189e36f3c1df547c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get 
"https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 10:22:47 crc kubenswrapper[4765]: body: Mar 19 10:22:47 crc kubenswrapper[4765]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:22:03.788145788 +0000 UTC m=+22.137091370,LastTimestamp:2026-03-19 10:22:03.788145788 +0000 UTC m=+22.137091370,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 10:22:47 crc kubenswrapper[4765]: > Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.154710 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36f3c1e06788 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:22:03.7882162 +0000 UTC m=+22.137161782,LastTimestamp:2026-03-19 10:22:03.7882162 +0000 UTC m=+22.137161782,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.159751 4765 event.go:359] "Server rejected event (will not retry!)" err="events 
\"kube-controller-manager-crc.189e36f3c1df547c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 10:22:47 crc kubenswrapper[4765]: &Event{ObjectMeta:{kube-controller-manager-crc.189e36f3c1df547c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 10:22:47 crc kubenswrapper[4765]: body: Mar 19 10:22:47 crc kubenswrapper[4765]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:22:03.788145788 +0000 UTC m=+22.137091370,LastTimestamp:2026-03-19 10:22:13.78711647 +0000 UTC m=+32.136062012,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 10:22:47 crc kubenswrapper[4765]: > Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.163148 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e36f3c1e06788\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36f3c1e06788 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:22:03.7882162 +0000 UTC m=+22.137161782,LastTimestamp:2026-03-19 10:22:13.787184852 +0000 UTC m=+32.136130394,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.166689 4765 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36f61606a970 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:22:13.789944176 +0000 UTC m=+32.138889738,LastTimestamp:2026-03-19 10:22:13.789944176 +0000 UTC m=+32.138889738,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.170266 4765 event.go:359] "Server rejected event (will 
not retry!)" err="events \"kube-controller-manager-crc.189e36ef0698b924\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36ef0698b924 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.466309924 +0000 UTC m=+1.815255466,LastTimestamp:2026-03-19 10:22:13.906090933 +0000 UTC m=+32.255036505,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.174137 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e36ef173b7b16\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36ef173b7b16 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container 
cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.745411862 +0000 UTC m=+2.094357404,LastTimestamp:2026-03-19 10:22:14.05316729 +0000 UTC m=+32.402112832,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.177601 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e36ef17f3c4c4\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36ef17f3c4c4 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:21:43.757489348 +0000 UTC m=+2.106434890,LastTimestamp:2026-03-19 10:22:14.064388061 +0000 UTC m=+32.413333623,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.183773 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e36f3c1df547c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 10:22:47 crc kubenswrapper[4765]: &Event{ObjectMeta:{kube-controller-manager-crc.189e36f3c1df547c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC 
map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 10:22:47 crc kubenswrapper[4765]: body: Mar 19 10:22:47 crc kubenswrapper[4765]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:22:03.788145788 +0000 UTC m=+22.137091370,LastTimestamp:2026-03-19 10:22:23.787293319 +0000 UTC m=+42.136238861,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 10:22:47 crc kubenswrapper[4765]: > Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.187508 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e36f3c1e06788\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e36f3c1e06788 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:22:03.7882162 +0000 UTC m=+22.137161782,LastTimestamp:2026-03-19 
10:22:23.787357261 +0000 UTC m=+42.136302803,Count:3,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:22:47 crc kubenswrapper[4765]: E0319 10:22:47.191429 4765 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e36f3c1df547c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 19 10:22:47 crc kubenswrapper[4765]: &Event{ObjectMeta:{kube-controller-manager-crc.189e36f3c1df547c openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 19 10:22:47 crc kubenswrapper[4765]: body: Mar 19 10:22:47 crc kubenswrapper[4765]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:22:03.788145788 +0000 UTC m=+22.137091370,LastTimestamp:2026-03-19 10:22:33.786740927 +0000 UTC m=+52.135686489,Count:4,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 19 10:22:47 crc kubenswrapper[4765]: > Mar 19 10:22:47 crc kubenswrapper[4765]: I0319 10:22:47.300461 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 10:22:48 crc kubenswrapper[4765]: I0319 10:22:48.300380 4765 
csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 10:22:49 crc kubenswrapper[4765]: I0319 10:22:49.301071 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 10:22:50 crc kubenswrapper[4765]: I0319 10:22:50.301029 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 10:22:50 crc kubenswrapper[4765]: W0319 10:22:50.445431 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 19 10:22:50 crc kubenswrapper[4765]: E0319 10:22:50.445492 4765 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 19 10:22:50 crc kubenswrapper[4765]: I0319 10:22:50.787571 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:22:50 crc kubenswrapper[4765]: I0319 10:22:50.787798 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:50 crc kubenswrapper[4765]: I0319 10:22:50.789216 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 19 10:22:50 crc kubenswrapper[4765]: I0319 10:22:50.789273 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:50 crc kubenswrapper[4765]: I0319 10:22:50.789287 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:51 crc kubenswrapper[4765]: I0319 10:22:51.176839 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:22:51 crc kubenswrapper[4765]: I0319 10:22:51.299031 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 10:22:51 crc kubenswrapper[4765]: I0319 10:22:51.554785 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:22:51 crc kubenswrapper[4765]: I0319 10:22:51.619758 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:51 crc kubenswrapper[4765]: I0319 10:22:51.621014 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:51 crc kubenswrapper[4765]: I0319 10:22:51.621124 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:51 crc kubenswrapper[4765]: I0319 10:22:51.621191 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:52 crc kubenswrapper[4765]: E0319 10:22:52.082108 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource 
\"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 19 10:22:52 crc kubenswrapper[4765]: I0319 10:22:52.082991 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:52 crc kubenswrapper[4765]: I0319 10:22:52.084321 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:52 crc kubenswrapper[4765]: I0319 10:22:52.084369 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:52 crc kubenswrapper[4765]: I0319 10:22:52.084382 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:52 crc kubenswrapper[4765]: I0319 10:22:52.084415 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 10:22:52 crc kubenswrapper[4765]: E0319 10:22:52.090143 4765 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 19 10:22:52 crc kubenswrapper[4765]: I0319 10:22:52.299121 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 10:22:52 crc kubenswrapper[4765]: I0319 10:22:52.355272 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:52 crc kubenswrapper[4765]: I0319 10:22:52.356530 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:52 crc kubenswrapper[4765]: I0319 10:22:52.356565 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 10:22:52 crc kubenswrapper[4765]: I0319 10:22:52.356575 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:52 crc kubenswrapper[4765]: E0319 10:22:52.423874 4765 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 10:22:52 crc kubenswrapper[4765]: I0319 10:22:52.621459 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:52 crc kubenswrapper[4765]: I0319 10:22:52.622344 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:52 crc kubenswrapper[4765]: I0319 10:22:52.622409 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:52 crc kubenswrapper[4765]: I0319 10:22:52.622421 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:53 crc kubenswrapper[4765]: I0319 10:22:53.083686 4765 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 19 10:22:53 crc kubenswrapper[4765]: I0319 10:22:53.102778 4765 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 10:22:53 crc kubenswrapper[4765]: I0319 10:22:53.299921 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 10:22:53 crc kubenswrapper[4765]: I0319 10:22:53.355217 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:53 crc kubenswrapper[4765]: I0319 10:22:53.356485 4765 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:53 crc kubenswrapper[4765]: I0319 10:22:53.356529 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:53 crc kubenswrapper[4765]: I0319 10:22:53.356545 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:53 crc kubenswrapper[4765]: I0319 10:22:53.357272 4765 scope.go:117] "RemoveContainer" containerID="a32f4be5c31af72605381eb407e6b3a9f7a38cc6ed772f797f431891435b1944" Mar 19 10:22:53 crc kubenswrapper[4765]: I0319 10:22:53.625453 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 10:22:53 crc kubenswrapper[4765]: I0319 10:22:53.627674 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1"} Mar 19 10:22:53 crc kubenswrapper[4765]: I0319 10:22:53.627896 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:53 crc kubenswrapper[4765]: I0319 10:22:53.628887 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:53 crc kubenswrapper[4765]: I0319 10:22:53.628936 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:53 crc kubenswrapper[4765]: I0319 10:22:53.628951 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:54 crc kubenswrapper[4765]: I0319 10:22:54.299623 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: 
csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 10:22:54 crc kubenswrapper[4765]: I0319 10:22:54.632317 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 10:22:54 crc kubenswrapper[4765]: I0319 10:22:54.633351 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 19 10:22:54 crc kubenswrapper[4765]: I0319 10:22:54.635316 4765 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1" exitCode=255 Mar 19 10:22:54 crc kubenswrapper[4765]: I0319 10:22:54.635392 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1"} Mar 19 10:22:54 crc kubenswrapper[4765]: I0319 10:22:54.635537 4765 scope.go:117] "RemoveContainer" containerID="a32f4be5c31af72605381eb407e6b3a9f7a38cc6ed772f797f431891435b1944" Mar 19 10:22:54 crc kubenswrapper[4765]: I0319 10:22:54.635739 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:54 crc kubenswrapper[4765]: I0319 10:22:54.636835 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:54 crc kubenswrapper[4765]: I0319 10:22:54.636861 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:54 crc kubenswrapper[4765]: I0319 10:22:54.636871 4765 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:54 crc kubenswrapper[4765]: I0319 10:22:54.637486 4765 scope.go:117] "RemoveContainer" containerID="678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1" Mar 19 10:22:54 crc kubenswrapper[4765]: E0319 10:22:54.637682 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 10:22:55 crc kubenswrapper[4765]: I0319 10:22:55.299933 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 10:22:55 crc kubenswrapper[4765]: I0319 10:22:55.639241 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 10:22:56 crc kubenswrapper[4765]: I0319 10:22:56.300543 4765 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 19 10:22:56 crc kubenswrapper[4765]: W0319 10:22:56.536073 4765 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 19 10:22:56 crc kubenswrapper[4765]: E0319 10:22:56.536152 4765 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 19 10:22:57 crc kubenswrapper[4765]: I0319 10:22:57.123133 4765 csr.go:261] certificate signing request csr-plc8m is approved, waiting to be issued Mar 19 10:22:57 crc kubenswrapper[4765]: I0319 10:22:57.131036 4765 csr.go:257] certificate signing request csr-plc8m is issued Mar 19 10:22:57 crc kubenswrapper[4765]: I0319 10:22:57.146340 4765 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 19 10:22:57 crc kubenswrapper[4765]: I0319 10:22:57.223819 4765 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 19 10:22:57 crc kubenswrapper[4765]: I0319 10:22:57.386524 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:22:57 crc kubenswrapper[4765]: I0319 10:22:57.386731 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:57 crc kubenswrapper[4765]: I0319 10:22:57.388106 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:57 crc kubenswrapper[4765]: I0319 10:22:57.388154 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:57 crc kubenswrapper[4765]: I0319 10:22:57.388166 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:57 crc kubenswrapper[4765]: I0319 10:22:57.388838 4765 scope.go:117] "RemoveContainer" containerID="678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1" Mar 19 10:22:57 crc kubenswrapper[4765]: E0319 
10:22:57.389070 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 10:22:58 crc kubenswrapper[4765]: I0319 10:22:58.132671 4765 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-29 19:41:34.146179089 +0000 UTC Mar 19 10:22:58 crc kubenswrapper[4765]: I0319 10:22:58.132757 4765 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 6129h18m36.013426639s for next certificate rotation Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.090448 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.091731 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.091782 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.091794 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.091996 4765 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.104337 4765 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.104678 4765 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 19 10:22:59 crc 
kubenswrapper[4765]: E0319 10:22:59.104718 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.109986 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.110044 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.110057 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.110079 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.110093 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:22:59Z","lastTransitionTime":"2026-03-19T10:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:22:59 crc kubenswrapper[4765]: E0319 10:22:59.129678 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.142701 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.142749 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.142761 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.142784 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.142798 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:22:59Z","lastTransitionTime":"2026-03-19T10:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:22:59 crc kubenswrapper[4765]: E0319 10:22:59.156377 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.169758 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.170796 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.170860 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.170889 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.170916 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:22:59Z","lastTransitionTime":"2026-03-19T10:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:22:59 crc kubenswrapper[4765]: E0319 10:22:59.183950 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.192109 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.192184 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.192203 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.192230 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.192247 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:22:59Z","lastTransitionTime":"2026-03-19T10:22:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:22:59 crc kubenswrapper[4765]: E0319 10:22:59.204923 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:59Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:22:59 crc kubenswrapper[4765]: E0319 10:22:59.205368 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 10:22:59 crc kubenswrapper[4765]: E0319 10:22:59.205426 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:22:59 crc kubenswrapper[4765]: E0319 10:22:59.306538 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:22:59 crc kubenswrapper[4765]: E0319 10:22:59.407612 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:22:59 crc kubenswrapper[4765]: E0319 10:22:59.508735 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:22:59 crc kubenswrapper[4765]: E0319 10:22:59.609753 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:22:59 crc kubenswrapper[4765]: E0319 10:22:59.710919 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:22:59 crc kubenswrapper[4765]: E0319 10:22:59.811326 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.838600 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.838843 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 
10:22:59.840546 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.840626 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.840828 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:22:59 crc kubenswrapper[4765]: I0319 10:22:59.841947 4765 scope.go:117] "RemoveContainer" containerID="678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1" Mar 19 10:22:59 crc kubenswrapper[4765]: E0319 10:22:59.842263 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 10:22:59 crc kubenswrapper[4765]: E0319 10:22:59.911536 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:00 crc kubenswrapper[4765]: E0319 10:23:00.012773 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:00 crc kubenswrapper[4765]: E0319 10:23:00.114113 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:00 crc kubenswrapper[4765]: E0319 10:23:00.214550 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:00 crc kubenswrapper[4765]: E0319 10:23:00.315605 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:00 crc 
kubenswrapper[4765]: E0319 10:23:00.416177 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:00 crc kubenswrapper[4765]: E0319 10:23:00.516337 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:00 crc kubenswrapper[4765]: E0319 10:23:00.616723 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:00 crc kubenswrapper[4765]: E0319 10:23:00.852474 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:00 crc kubenswrapper[4765]: E0319 10:23:00.952623 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:01 crc kubenswrapper[4765]: E0319 10:23:01.053366 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:01 crc kubenswrapper[4765]: E0319 10:23:01.154159 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:01 crc kubenswrapper[4765]: E0319 10:23:01.255158 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:01 crc kubenswrapper[4765]: I0319 10:23:01.340691 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:23:01 crc kubenswrapper[4765]: I0319 10:23:01.340920 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:23:01 crc kubenswrapper[4765]: I0319 10:23:01.342236 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:01 crc kubenswrapper[4765]: I0319 10:23:01.342294 4765 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:01 crc kubenswrapper[4765]: I0319 10:23:01.342305 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:01 crc kubenswrapper[4765]: E0319 10:23:01.355558 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:01 crc kubenswrapper[4765]: E0319 10:23:01.455678 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:01 crc kubenswrapper[4765]: E0319 10:23:01.555837 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:01 crc kubenswrapper[4765]: E0319 10:23:01.656658 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:01 crc kubenswrapper[4765]: E0319 10:23:01.757227 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:01 crc kubenswrapper[4765]: E0319 10:23:01.857828 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:01 crc kubenswrapper[4765]: E0319 10:23:01.958588 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:02 crc kubenswrapper[4765]: E0319 10:23:02.059347 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:02 crc kubenswrapper[4765]: E0319 10:23:02.160297 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:02 crc kubenswrapper[4765]: E0319 10:23:02.261234 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:02 crc kubenswrapper[4765]: E0319 10:23:02.362424 4765 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:02 crc kubenswrapper[4765]: E0319 10:23:02.424260 4765 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 10:23:02 crc kubenswrapper[4765]: E0319 10:23:02.463373 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:02 crc kubenswrapper[4765]: E0319 10:23:02.564398 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:02 crc kubenswrapper[4765]: E0319 10:23:02.665508 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:02 crc kubenswrapper[4765]: E0319 10:23:02.766220 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:02 crc kubenswrapper[4765]: E0319 10:23:02.867197 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:02 crc kubenswrapper[4765]: E0319 10:23:02.968273 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:03 crc kubenswrapper[4765]: E0319 10:23:03.069002 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:03 crc kubenswrapper[4765]: E0319 10:23:03.169736 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:03 crc kubenswrapper[4765]: E0319 10:23:03.269888 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:03 crc kubenswrapper[4765]: E0319 10:23:03.370632 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" 
Mar 19 10:23:03 crc kubenswrapper[4765]: E0319 10:23:03.471196 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:03 crc kubenswrapper[4765]: E0319 10:23:03.571794 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:03 crc kubenswrapper[4765]: E0319 10:23:03.672995 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:03 crc kubenswrapper[4765]: E0319 10:23:03.773262 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:03 crc kubenswrapper[4765]: E0319 10:23:03.873877 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:03 crc kubenswrapper[4765]: E0319 10:23:03.974905 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:04 crc kubenswrapper[4765]: E0319 10:23:04.075634 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:04 crc kubenswrapper[4765]: E0319 10:23:04.176015 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:04 crc kubenswrapper[4765]: E0319 10:23:04.276771 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:04 crc kubenswrapper[4765]: E0319 10:23:04.377217 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:04 crc kubenswrapper[4765]: E0319 10:23:04.478406 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:04 crc kubenswrapper[4765]: E0319 10:23:04.579126 4765 kubelet_node_status.go:503] "Error getting the current node 
from lister" err="node \"crc\" not found" Mar 19 10:23:04 crc kubenswrapper[4765]: E0319 10:23:04.679256 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:04 crc kubenswrapper[4765]: E0319 10:23:04.779388 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:04 crc kubenswrapper[4765]: E0319 10:23:04.880278 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:04 crc kubenswrapper[4765]: E0319 10:23:04.981003 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:05 crc kubenswrapper[4765]: E0319 10:23:05.081746 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:05 crc kubenswrapper[4765]: E0319 10:23:05.182750 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:05 crc kubenswrapper[4765]: E0319 10:23:05.283744 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:05 crc kubenswrapper[4765]: E0319 10:23:05.384440 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:05 crc kubenswrapper[4765]: E0319 10:23:05.485364 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:05 crc kubenswrapper[4765]: E0319 10:23:05.586129 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:05 crc kubenswrapper[4765]: E0319 10:23:05.686257 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:05 crc kubenswrapper[4765]: E0319 10:23:05.787121 4765 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:05 crc kubenswrapper[4765]: E0319 10:23:05.887566 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:05 crc kubenswrapper[4765]: E0319 10:23:05.988104 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:06 crc kubenswrapper[4765]: E0319 10:23:06.088736 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:06 crc kubenswrapper[4765]: E0319 10:23:06.189786 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:06 crc kubenswrapper[4765]: E0319 10:23:06.291069 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:06 crc kubenswrapper[4765]: E0319 10:23:06.391812 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:06 crc kubenswrapper[4765]: E0319 10:23:06.492680 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:06 crc kubenswrapper[4765]: E0319 10:23:06.593608 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:06 crc kubenswrapper[4765]: E0319 10:23:06.694514 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:06 crc kubenswrapper[4765]: E0319 10:23:06.794672 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:06 crc kubenswrapper[4765]: E0319 10:23:06.895144 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:06 crc 
kubenswrapper[4765]: E0319 10:23:06.996107 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:07 crc kubenswrapper[4765]: E0319 10:23:07.096897 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:07 crc kubenswrapper[4765]: E0319 10:23:07.197254 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:07 crc kubenswrapper[4765]: E0319 10:23:07.297505 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:07 crc kubenswrapper[4765]: E0319 10:23:07.397662 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:07 crc kubenswrapper[4765]: E0319 10:23:07.498229 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:07 crc kubenswrapper[4765]: E0319 10:23:07.599227 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:07 crc kubenswrapper[4765]: E0319 10:23:07.700140 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:07 crc kubenswrapper[4765]: E0319 10:23:07.801200 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:07 crc kubenswrapper[4765]: E0319 10:23:07.902119 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:08 crc kubenswrapper[4765]: E0319 10:23:08.003082 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:08 crc kubenswrapper[4765]: E0319 10:23:08.103191 4765 kubelet_node_status.go:503] "Error getting the current node from lister" 
err="node \"crc\" not found" Mar 19 10:23:08 crc kubenswrapper[4765]: E0319 10:23:08.203865 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:08 crc kubenswrapper[4765]: E0319 10:23:08.304667 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:08 crc kubenswrapper[4765]: E0319 10:23:08.405502 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:08 crc kubenswrapper[4765]: E0319 10:23:08.506436 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:08 crc kubenswrapper[4765]: E0319 10:23:08.607069 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:08 crc kubenswrapper[4765]: E0319 10:23:08.707872 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:08 crc kubenswrapper[4765]: E0319 10:23:08.808812 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:08 crc kubenswrapper[4765]: E0319 10:23:08.910087 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.010832 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.111612 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.211739 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.312852 4765 kubelet_node_status.go:503] 
"Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.413433 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.491977 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.496572 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.496701 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.497154 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.497243 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.497314 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:09Z","lastTransitionTime":"2026-03-19T10:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.506146 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.514879 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.514926 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.514935 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.514954 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.514978 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:09Z","lastTransitionTime":"2026-03-19T10:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.524933 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.531720 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.531768 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.531779 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.531799 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.531811 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:09Z","lastTransitionTime":"2026-03-19T10:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.541386 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.548572 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.548668 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.548687 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.548744 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:09 crc kubenswrapper[4765]: I0319 10:23:09.548765 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:09Z","lastTransitionTime":"2026-03-19T10:23:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.562321 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:09Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.562523 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.562572 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.663640 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.763774 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.864339 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:09 crc kubenswrapper[4765]: E0319 10:23:09.965139 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:10 crc kubenswrapper[4765]: E0319 10:23:10.065898 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:10 crc kubenswrapper[4765]: E0319 10:23:10.166833 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:10 crc kubenswrapper[4765]: E0319 10:23:10.267413 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:10 crc kubenswrapper[4765]: E0319 10:23:10.368796 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:10 crc kubenswrapper[4765]: E0319 10:23:10.470181 4765 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:10 crc kubenswrapper[4765]: E0319 10:23:10.571624 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:10 crc kubenswrapper[4765]: E0319 10:23:10.672163 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:10 crc kubenswrapper[4765]: E0319 10:23:10.772942 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:10 crc kubenswrapper[4765]: E0319 10:23:10.873922 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:10 crc kubenswrapper[4765]: E0319 10:23:10.975076 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:11 crc kubenswrapper[4765]: E0319 10:23:11.075941 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:11 crc kubenswrapper[4765]: E0319 10:23:11.176066 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:11 crc kubenswrapper[4765]: E0319 10:23:11.276889 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:11 crc kubenswrapper[4765]: E0319 10:23:11.377058 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:11 crc kubenswrapper[4765]: E0319 10:23:11.477943 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:11 crc kubenswrapper[4765]: E0319 10:23:11.578278 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:11 crc 
kubenswrapper[4765]: E0319 10:23:11.679223 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:11 crc kubenswrapper[4765]: E0319 10:23:11.780370 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:11 crc kubenswrapper[4765]: E0319 10:23:11.881373 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:11 crc kubenswrapper[4765]: E0319 10:23:11.981703 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:12 crc kubenswrapper[4765]: E0319 10:23:12.082164 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:12 crc kubenswrapper[4765]: E0319 10:23:12.183079 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:12 crc kubenswrapper[4765]: E0319 10:23:12.283206 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:12 crc kubenswrapper[4765]: E0319 10:23:12.383688 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:12 crc kubenswrapper[4765]: E0319 10:23:12.425460 4765 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 19 10:23:12 crc kubenswrapper[4765]: E0319 10:23:12.483828 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:12 crc kubenswrapper[4765]: E0319 10:23:12.585213 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:12 crc kubenswrapper[4765]: E0319 10:23:12.685849 4765 kubelet_node_status.go:503] "Error getting the current 
node from lister" err="node \"crc\" not found" Mar 19 10:23:12 crc kubenswrapper[4765]: E0319 10:23:12.786738 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:12 crc kubenswrapper[4765]: E0319 10:23:12.887075 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:12 crc kubenswrapper[4765]: E0319 10:23:12.988203 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:13 crc kubenswrapper[4765]: E0319 10:23:13.088614 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:13 crc kubenswrapper[4765]: E0319 10:23:13.189433 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:13 crc kubenswrapper[4765]: E0319 10:23:13.290135 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:13 crc kubenswrapper[4765]: I0319 10:23:13.356068 4765 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 19 10:23:13 crc kubenswrapper[4765]: I0319 10:23:13.357228 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:13 crc kubenswrapper[4765]: I0319 10:23:13.357404 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:13 crc kubenswrapper[4765]: I0319 10:23:13.357712 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:13 crc kubenswrapper[4765]: I0319 10:23:13.358613 4765 scope.go:117] "RemoveContainer" containerID="678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1" Mar 19 10:23:13 crc kubenswrapper[4765]: E0319 10:23:13.358923 4765 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 10:23:13 crc kubenswrapper[4765]: E0319 10:23:13.390577 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:13 crc kubenswrapper[4765]: E0319 10:23:13.490725 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:13 crc kubenswrapper[4765]: E0319 10:23:13.591660 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:13 crc kubenswrapper[4765]: E0319 10:23:13.692303 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:13 crc kubenswrapper[4765]: E0319 10:23:13.792746 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:13 crc kubenswrapper[4765]: E0319 10:23:13.893638 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:13 crc kubenswrapper[4765]: E0319 10:23:13.994531 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:14 crc kubenswrapper[4765]: E0319 10:23:14.095111 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:14 crc kubenswrapper[4765]: E0319 10:23:14.196004 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:14 crc kubenswrapper[4765]: E0319 10:23:14.296456 4765 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:14 crc kubenswrapper[4765]: E0319 10:23:14.397500 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:14 crc kubenswrapper[4765]: E0319 10:23:14.497602 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:14 crc kubenswrapper[4765]: E0319 10:23:14.598899 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:14 crc kubenswrapper[4765]: E0319 10:23:14.699721 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:14 crc kubenswrapper[4765]: E0319 10:23:14.800147 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:14 crc kubenswrapper[4765]: E0319 10:23:14.900890 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.001903 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.102324 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.202445 4765 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.262941 4765 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.304198 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:15 crc 
kubenswrapper[4765]: I0319 10:23:15.304236 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.304245 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.304260 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.304270 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:15Z","lastTransitionTime":"2026-03-19T10:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.342646 4765 apiserver.go:52] "Watching apiserver" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.347806 4765 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.348108 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf"] Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.348586 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.348665 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.348729 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.348752 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.348742 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.348796 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.348621 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.349064 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.349137 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.350487 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.350714 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.351034 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.351466 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.351890 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.352029 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.352060 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.352130 4765 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.353448 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.375989 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.387415 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.398020 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.398383 4765 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.405868 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.405900 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.405910 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.405925 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.405937 4765 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:15Z","lastTransitionTime":"2026-03-19T10:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.408060 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.416793 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.424508 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.434571 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.446285 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.449848 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.449902 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod 
\"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.449926 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.449948 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.449981 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450002 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450019 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450039 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" 
(UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450059 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450078 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450094 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450108 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450128 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 
10:23:15.450143 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450160 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450183 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450204 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450253 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450272 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450291 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450312 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450333 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450352 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450370 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450372 4765 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450688 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450751 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450742 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450771 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451017 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451045 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451060 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451137 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451240 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451347 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451443 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451509 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451574 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.450390 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451677 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451712 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 
10:23:15.451743 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451769 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451790 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451817 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451843 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451905 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod 
\"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451907 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451924 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.451934 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452000 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452035 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452063 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452098 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452132 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452144 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452156 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452162 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452183 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452184 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452208 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452252 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452276 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452298 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452362 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452388 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452406 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452424 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452441 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452458 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452482 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452511 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452539 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452562 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452580 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452599 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452620 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452642 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452666 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452691 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452719 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452740 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 10:23:15 crc 
kubenswrapper[4765]: I0319 10:23:15.452758 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452775 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452790 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452791 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452805 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452823 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452839 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452862 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452880 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452897 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" 
(UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452913 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452929 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452945 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.452990 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453009 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 
10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453028 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453045 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453062 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453079 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453097 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453114 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453132 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453150 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453168 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453187 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453204 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod 
\"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453222 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453240 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453256 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453259 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453272 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453289 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453308 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453326 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453344 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453360 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453377 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453393 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453399 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453410 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453468 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453510 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453544 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453631 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453673 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453713 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453731 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453753 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453790 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453829 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453872 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453907 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453936 4765 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.453943 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454010 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454027 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454045 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454079 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454111 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454138 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454164 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454174 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454189 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454217 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454227 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454248 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454278 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454307 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454336 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454361 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454376 4765 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454385 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454380 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454414 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454440 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454467 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454494 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454519 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454546 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454572 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454596 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454621 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454647 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454670 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: 
\"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454729 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454761 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454787 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454815 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454841 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454866 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: 
\"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455101 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455123 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455140 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455166 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455193 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 10:23:15 crc 
kubenswrapper[4765]: I0319 10:23:15.455219 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455245 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455269 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455294 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455607 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455637 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455661 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455690 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455714 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455739 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455766 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 19 10:23:15 crc 
kubenswrapper[4765]: I0319 10:23:15.455792 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455814 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455839 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455863 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455889 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455917 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: 
\"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455945 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456011 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456039 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456056 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456073 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 19 10:23:15 crc kubenswrapper[4765]: 
I0319 10:23:15.456091 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456109 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456129 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456154 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456178 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456203 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456225 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456249 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456278 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456304 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456373 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 
10:23:15.456401 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456429 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456454 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456476 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456501 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456528 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456556 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456583 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456610 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456636 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456665 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456694 
4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456718 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456745 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456771 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456830 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456863 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456896 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456926 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456962 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.457035 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:15 crc 
kubenswrapper[4765]: I0319 10:23:15.457059 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.457084 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.457113 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.457140 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.457520 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.460665 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.461038 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.454620 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455168 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455194 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455295 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455521 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455711 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455742 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455370 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.455898 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456297 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456489 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.456956 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.457168 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.457436 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.457566 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:23:15.957519523 +0000 UTC m=+94.306465105 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.463058 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.463105 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.463141 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: 
\"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.463231 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.463245 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.458311 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.458350 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). 
InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.463462 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.458407 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.458576 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.458599 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.458600 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.458851 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.458871 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.459023 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.459051 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.459464 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.459488 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.459511 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.460028 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.460466 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.460520 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.460831 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.460871 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.460894 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.460995 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.461058 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.461191 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.461243 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.461378 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.461282 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.461887 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.462043 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.462297 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.462341 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.462788 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.464556 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.465018 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.465182 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.465324 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.463273 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.465654 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.465915 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.465943 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.465976 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.465993 4765 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468080 4765 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468099 4765 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468117 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468130 4765 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468146 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468161 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468176 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") 
on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468188 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468200 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468211 4765 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468225 4765 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468238 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468251 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468265 4765 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468278 4765 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468292 4765 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468621 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468640 4765 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468653 4765 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468668 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468683 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468701 4765 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on 
node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468715 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468731 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468754 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468769 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468784 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468806 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468820 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468834 4765 reconciler_common.go:293] 
"Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.471413 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.471450 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.471472 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.471491 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.471505 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.471577 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.474777 4765 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.475611 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.465619 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.465922 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.466316 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.466333 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.466375 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.466363 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.466402 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.457794 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.466774 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.466826 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.465088 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.467350 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.467422 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.467633 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.476226 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.476242 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-19 10:23:15.976221701 +0000 UTC m=+94.325167243 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.476199 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468291 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.469240 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.469491 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.469533 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.469774 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.469795 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.469994 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.470035 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.470072 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.469995 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.470305 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.470359 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.470467 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.470531 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.470768 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.471354 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.471452 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.471481 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.471392 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.471688 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.471822 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.472051 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.472079 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.472290 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.472659 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.472738 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.473160 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.484168 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.486629 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.487328 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.473720 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: 
"c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.473774 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.473846 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.473431 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.474251 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.474401 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.474347 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.474911 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.475240 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.475232 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.475555 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.475878 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.476484 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.476606 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.476784 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.468947 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.476843 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.477075 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.477136 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.477184 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.477425 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.477578 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.478832 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.478850 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.483099 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.483255 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.486324 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.486528 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.487424 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.487545 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.488354 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:15.988326024 +0000 UTC m=+94.337271566 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.489243 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.489423 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.489590 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.489607 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.489624 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.489771 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.490262 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.490281 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.490459 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.490800 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.490997 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.490859 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.491036 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.491319 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.491384 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:15.991367045 +0000 UTC m=+94.340312577 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.491054 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.491070 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.493102 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.493229 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.493328 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.493563 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.494596 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.496137 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.496339 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.497500 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.498135 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.498267 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.498349 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.499208 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.499512 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.499621 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.501331 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.504230 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.505101 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.505135 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.505150 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.505218 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:16.005191303 +0000 UTC m=+94.354136845 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.510888 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.510950 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.511244 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.511483 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.511771 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.511907 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.512322 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.512533 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.513155 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.514064 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.514213 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.516343 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.517200 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.517244 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.517260 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.517283 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.517297 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:15Z","lastTransitionTime":"2026-03-19T10:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.523260 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.525291 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.526700 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.535492 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.572872 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.572924 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573088 4765 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573101 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573112 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573103 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: 
\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573121 4765 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573186 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573201 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573177 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573215 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573228 4765 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573242 4765 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573252 4765 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573262 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573303 4765 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573315 4765 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573326 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573337 4765 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573347 4765 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573366 4765 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573376 4765 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573386 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573395 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573405 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573416 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573426 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 
10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573436 4765 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573446 4765 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573458 4765 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573467 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573477 4765 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573487 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573498 4765 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573510 4765 
reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573521 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573530 4765 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573540 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573551 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573561 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573572 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 
10:23:15.573582 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573593 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573602 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573611 4765 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573620 4765 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573630 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573639 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573652 4765 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573662 4765 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573672 4765 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573680 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573690 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573699 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573709 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573719 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on 
node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573727 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573737 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573746 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573755 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573764 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573774 4765 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573783 4765 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc 
kubenswrapper[4765]: I0319 10:23:15.573792 4765 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573801 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573813 4765 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573824 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573833 4765 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573843 4765 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573853 4765 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573864 4765 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573875 4765 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573886 4765 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573896 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573906 4765 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573916 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573926 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573934 4765 reconciler_common.go:293] "Volume detached for volume 
\"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573944 4765 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573953 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573977 4765 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573987 4765 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.573997 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574006 4765 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574015 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: 
\"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574024 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574034 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574043 4765 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574052 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574062 4765 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574071 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574080 4765 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath 
\"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574089 4765 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574099 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574108 4765 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574117 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574127 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574136 4765 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574147 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574156 4765 reconciler_common.go:293] "Volume 
detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574166 4765 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574176 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574186 4765 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574195 4765 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574205 4765 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574214 4765 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574224 4765 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574233 4765 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574243 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574252 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574262 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574271 4765 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574280 4765 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574291 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: 
\"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574300 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574309 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574318 4765 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574326 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574336 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574345 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574354 4765 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") 
on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574364 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574372 4765 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574381 4765 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574389 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574398 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574408 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574417 4765 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574427 4765 reconciler_common.go:293] "Volume detached 
for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574436 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574445 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574453 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574462 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574470 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574480 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574489 4765 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574499 4765 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574508 4765 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574517 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574527 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574535 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574544 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574552 4765 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574561 4765 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574570 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574579 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574588 4765 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574597 4765 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574611 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574622 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: 
I0319 10:23:15.574634 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574648 4765 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574660 4765 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574671 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574681 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.574691 4765 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.620098 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.620127 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.620137 4765 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.620154 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.620168 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:15Z","lastTransitionTime":"2026-03-19T10:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.669581 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 19 10:23:15 crc kubenswrapper[4765]: W0319 10:23:15.683342 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-139d640925c1ec1844d56308f9b008781c93966403adf225f0583c21d38b7f91 WatchSource:0}: Error finding container 139d640925c1ec1844d56308f9b008781c93966403adf225f0583c21d38b7f91: Status 404 returned error can't find the container with id 139d640925c1ec1844d56308f9b008781c93966403adf225f0583c21d38b7f91 Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.686513 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.687123 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:23:15 crc kubenswrapper[4765]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 19 10:23:15 crc kubenswrapper[4765]: set -o allexport Mar 19 10:23:15 crc kubenswrapper[4765]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 19 10:23:15 crc kubenswrapper[4765]: source /etc/kubernetes/apiserver-url.env Mar 19 10:23:15 crc kubenswrapper[4765]: else Mar 19 10:23:15 crc kubenswrapper[4765]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 19 10:23:15 crc kubenswrapper[4765]: exit 1 Mar 19 10:23:15 crc kubenswrapper[4765]: fi Mar 19 10:23:15 crc kubenswrapper[4765]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 19 10:23:15 crc kubenswrapper[4765]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 10:23:15 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.688826 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.695239 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 19 10:23:15 crc kubenswrapper[4765]: W0319 10:23:15.696183 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-d208ead845d2a05c167af67f11a3be8969d851cb66dbe07a58f9d4216ccafb86 WatchSource:0}: Error finding container d208ead845d2a05c167af67f11a3be8969d851cb66dbe07a58f9d4216ccafb86: Status 404 returned error can't find the container with id d208ead845d2a05c167af67f11a3be8969d851cb66dbe07a58f9d4216ccafb86 Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.699251 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.701226 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 19 10:23:15 crc kubenswrapper[4765]: W0319 10:23:15.705989 4765 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-e600985384d489bb6a24188438e5fd5180967d09e76f9207758334962db4f46c WatchSource:0}: Error finding container e600985384d489bb6a24188438e5fd5180967d09e76f9207758334962db4f46c: Status 404 returned error can't find the container with id e600985384d489bb6a24188438e5fd5180967d09e76f9207758334962db4f46c Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.707911 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:23:15 crc kubenswrapper[4765]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 10:23:15 crc kubenswrapper[4765]: if [[ -f "/env/_master" ]]; then Mar 19 10:23:15 crc kubenswrapper[4765]: set -o allexport Mar 19 10:23:15 crc kubenswrapper[4765]: source "/env/_master" Mar 19 10:23:15 crc kubenswrapper[4765]: set +o allexport Mar 19 10:23:15 crc kubenswrapper[4765]: fi Mar 19 10:23:15 crc kubenswrapper[4765]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 19 10:23:15 crc kubenswrapper[4765]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 19 10:23:15 crc kubenswrapper[4765]: ho_enable="--enable-hybrid-overlay" Mar 19 10:23:15 crc kubenswrapper[4765]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 19 10:23:15 crc kubenswrapper[4765]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 19 10:23:15 crc kubenswrapper[4765]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 19 10:23:15 crc kubenswrapper[4765]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 10:23:15 crc kubenswrapper[4765]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 19 10:23:15 crc kubenswrapper[4765]: --webhook-host=127.0.0.1 \ Mar 19 10:23:15 crc kubenswrapper[4765]: --webhook-port=9743 \ Mar 19 10:23:15 crc kubenswrapper[4765]: ${ho_enable} \ Mar 19 10:23:15 crc kubenswrapper[4765]: --enable-interconnect \ Mar 19 10:23:15 crc kubenswrapper[4765]: --disable-approver \ Mar 19 10:23:15 crc kubenswrapper[4765]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 19 10:23:15 crc kubenswrapper[4765]: --wait-for-kubernetes-api=200s \ Mar 19 10:23:15 crc kubenswrapper[4765]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 19 10:23:15 crc kubenswrapper[4765]: --loglevel="${LOGLEVEL}" Mar 19 10:23:15 crc kubenswrapper[4765]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 10:23:15 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.710479 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:23:15 crc kubenswrapper[4765]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 10:23:15 crc 
kubenswrapper[4765]: if [[ -f "/env/_master" ]]; then Mar 19 10:23:15 crc kubenswrapper[4765]: set -o allexport Mar 19 10:23:15 crc kubenswrapper[4765]: source "/env/_master" Mar 19 10:23:15 crc kubenswrapper[4765]: set +o allexport Mar 19 10:23:15 crc kubenswrapper[4765]: fi Mar 19 10:23:15 crc kubenswrapper[4765]: Mar 19 10:23:15 crc kubenswrapper[4765]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 19 10:23:15 crc kubenswrapper[4765]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 10:23:15 crc kubenswrapper[4765]: --disable-webhook \ Mar 19 10:23:15 crc kubenswrapper[4765]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 19 10:23:15 crc kubenswrapper[4765]: --loglevel="${LOGLEVEL}" Mar 19 10:23:15 crc kubenswrapper[4765]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 10:23:15 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.711682 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.721845 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.721887 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.721900 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.721918 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.721933 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:15Z","lastTransitionTime":"2026-03-19T10:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.824688 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.824737 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.824750 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.824768 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.824780 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:15Z","lastTransitionTime":"2026-03-19T10:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.892742 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d208ead845d2a05c167af67f11a3be8969d851cb66dbe07a58f9d4216ccafb86"} Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.893693 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"139d640925c1ec1844d56308f9b008781c93966403adf225f0583c21d38b7f91"} Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.896296 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"e600985384d489bb6a24188438e5fd5180967d09e76f9207758334962db4f46c"} Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.896421 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.896905 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:23:15 crc kubenswrapper[4765]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 19 10:23:15 crc kubenswrapper[4765]: set -o allexport Mar 19 10:23:15 crc kubenswrapper[4765]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 19 10:23:15 crc kubenswrapper[4765]: source /etc/kubernetes/apiserver-url.env Mar 19 10:23:15 crc 
kubenswrapper[4765]: else Mar 19 10:23:15 crc kubenswrapper[4765]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 19 10:23:15 crc kubenswrapper[4765]: exit 1 Mar 19 10:23:15 crc kubenswrapper[4765]: fi Mar 19 10:23:15 crc kubenswrapper[4765]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 19 10:23:15 crc kubenswrapper[4765]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,V
alue:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f
5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 10:23:15 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.897771 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, 
cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.898296 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.902593 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:23:15 crc kubenswrapper[4765]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 10:23:15 crc kubenswrapper[4765]: if [[ -f "/env/_master" ]]; then Mar 19 10:23:15 crc kubenswrapper[4765]: set -o allexport Mar 19 10:23:15 crc kubenswrapper[4765]: source "/env/_master" Mar 19 10:23:15 crc kubenswrapper[4765]: set +o allexport Mar 19 10:23:15 crc kubenswrapper[4765]: fi Mar 19 10:23:15 crc kubenswrapper[4765]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. 
Mar 19 10:23:15 crc kubenswrapper[4765]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 19 10:23:15 crc kubenswrapper[4765]: ho_enable="--enable-hybrid-overlay" Mar 19 10:23:15 crc kubenswrapper[4765]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 19 10:23:15 crc kubenswrapper[4765]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 19 10:23:15 crc kubenswrapper[4765]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 19 10:23:15 crc kubenswrapper[4765]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 10:23:15 crc kubenswrapper[4765]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 19 10:23:15 crc kubenswrapper[4765]: --webhook-host=127.0.0.1 \ Mar 19 10:23:15 crc kubenswrapper[4765]: --webhook-port=9743 \ Mar 19 10:23:15 crc kubenswrapper[4765]: ${ho_enable} \ Mar 19 10:23:15 crc kubenswrapper[4765]: --enable-interconnect \ Mar 19 10:23:15 crc kubenswrapper[4765]: --disable-approver \ Mar 19 10:23:15 crc kubenswrapper[4765]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 19 10:23:15 crc kubenswrapper[4765]: --wait-for-kubernetes-api=200s \ Mar 19 10:23:15 crc kubenswrapper[4765]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 19 10:23:15 crc kubenswrapper[4765]: --loglevel="${LOGLEVEL}" Mar 19 10:23:15 crc kubenswrapper[4765]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: 
{{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 10:23:15 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.905062 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:23:15 crc kubenswrapper[4765]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 10:23:15 crc 
kubenswrapper[4765]: if [[ -f "/env/_master" ]]; then Mar 19 10:23:15 crc kubenswrapper[4765]: set -o allexport Mar 19 10:23:15 crc kubenswrapper[4765]: source "/env/_master" Mar 19 10:23:15 crc kubenswrapper[4765]: set +o allexport Mar 19 10:23:15 crc kubenswrapper[4765]: fi Mar 19 10:23:15 crc kubenswrapper[4765]: Mar 19 10:23:15 crc kubenswrapper[4765]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 19 10:23:15 crc kubenswrapper[4765]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 10:23:15 crc kubenswrapper[4765]: --disable-webhook \ Mar 19 10:23:15 crc kubenswrapper[4765]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 19 10:23:15 crc kubenswrapper[4765]: --loglevel="${LOGLEVEL}" Mar 19 10:23:15 crc kubenswrapper[4765]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 10:23:15 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.906377 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-network-node-identity/network-node-identity-vrzqb" 
podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.912139 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.926540 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.930591 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.930768 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 
10:23:15.930798 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.930874 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.930946 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:15Z","lastTransitionTime":"2026-03-19T10:23:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.937065 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.947579 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.956642 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.963986 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.971218 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.978910 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.979084 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:23:16.979059722 +0000 UTC m=+95.328005264 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.979124 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.979264 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:23:15 crc kubenswrapper[4765]: E0319 10:23:15.979320 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:16.979308729 +0000 UTC m=+95.328254271 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.982295 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.991175 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:15 crc kubenswrapper[4765]: I0319 10:23:15.998281 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.005493 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.017621 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.033266 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.033307 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.033321 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.033370 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.033386 4765 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:16Z","lastTransitionTime":"2026-03-19T10:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.079930 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.080014 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.080051 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:16 crc kubenswrapper[4765]: E0319 10:23:16.080118 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:23:16 crc kubenswrapper[4765]: E0319 10:23:16.080145 
4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:23:16 crc kubenswrapper[4765]: E0319 10:23:16.080155 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:23:16 crc kubenswrapper[4765]: E0319 10:23:16.080160 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:16 crc kubenswrapper[4765]: E0319 10:23:16.080173 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:23:16 crc kubenswrapper[4765]: E0319 10:23:16.080185 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:16 crc kubenswrapper[4765]: E0319 10:23:16.080191 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:23:16 crc kubenswrapper[4765]: E0319 10:23:16.080228 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. 
No retries permitted until 2026-03-19 10:23:17.080203547 +0000 UTC m=+95.429149089 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:16 crc kubenswrapper[4765]: E0319 10:23:16.080252 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:17.080242199 +0000 UTC m=+95.429187741 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:16 crc kubenswrapper[4765]: E0319 10:23:16.080268 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:17.080258879 +0000 UTC m=+95.429204421 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.135560 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.135602 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.135614 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.135633 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.135645 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:16Z","lastTransitionTime":"2026-03-19T10:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.237565 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.237631 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.237641 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.237661 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.237670 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:16Z","lastTransitionTime":"2026-03-19T10:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.340287 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.340348 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.340466 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.340496 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.340534 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:16Z","lastTransitionTime":"2026-03-19T10:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.360513 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.362890 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.368086 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.368715 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.369691 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.370229 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.370816 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.372138 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" 
path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.373419 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.374868 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.375783 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.377544 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.378311 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.379254 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.380981 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.381639 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" 
path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.383149 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.383588 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.384198 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.385245 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.385744 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.386771 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.387237 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.388392 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" 
path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.388810 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.389458 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.390858 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.391722 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.393178 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.393853 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.395223 4765 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.395385 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.398044 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.398951 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.400221 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.402457 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.403522 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.404932 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.405942 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.407485 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.408113 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.409305 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.410053 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.411140 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.411634 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.412593 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.413198 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.414302 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.414752 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.415612 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.416086 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.416945 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.417617 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.418099 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.444566 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.444601 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:16 crc 
kubenswrapper[4765]: I0319 10:23:16.444613 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.444629 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.444641 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:16Z","lastTransitionTime":"2026-03-19T10:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.547276 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.547337 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.547346 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.547362 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.547373 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:16Z","lastTransitionTime":"2026-03-19T10:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.654895 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.654947 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.654981 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.655002 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.655017 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:16Z","lastTransitionTime":"2026-03-19T10:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.757370 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.757459 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.757485 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.757510 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.757528 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:16Z","lastTransitionTime":"2026-03-19T10:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.859664 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.859720 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.859734 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.859756 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.859771 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:16Z","lastTransitionTime":"2026-03-19T10:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.962186 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.962233 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.962274 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.962296 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.962306 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:16Z","lastTransitionTime":"2026-03-19T10:23:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.986333 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:23:16 crc kubenswrapper[4765]: E0319 10:23:16.986483 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 10:23:18.986459799 +0000 UTC m=+97.335405341 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:23:16 crc kubenswrapper[4765]: I0319 10:23:16.986537 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:16 crc kubenswrapper[4765]: E0319 10:23:16.986634 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:23:16 crc kubenswrapper[4765]: E0319 10:23:16.986688 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:18.986677255 +0000 UTC m=+97.335622797 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.065420 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.065474 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.065483 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.065500 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.065511 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:17Z","lastTransitionTime":"2026-03-19T10:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.087611 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.087755 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.087798 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:17 crc kubenswrapper[4765]: E0319 10:23:17.087807 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:23:17 crc kubenswrapper[4765]: E0319 10:23:17.087855 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:23:17 crc kubenswrapper[4765]: E0319 10:23:17.087870 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:17 crc kubenswrapper[4765]: E0319 10:23:17.087988 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:19.087934224 +0000 UTC m=+97.436879816 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:17 crc kubenswrapper[4765]: E0319 10:23:17.087953 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:23:17 crc kubenswrapper[4765]: E0319 10:23:17.088017 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:23:17 crc kubenswrapper[4765]: E0319 10:23:17.088038 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:17 crc kubenswrapper[4765]: E0319 10:23:17.088060 4765 secret.go:188] Couldn't get secret 
openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:23:17 crc kubenswrapper[4765]: E0319 10:23:17.088114 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:19.088091168 +0000 UTC m=+97.437036740 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:17 crc kubenswrapper[4765]: E0319 10:23:17.088173 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:19.088143889 +0000 UTC m=+97.437089471 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.169192 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.169253 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.169266 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.169288 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.169305 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:17Z","lastTransitionTime":"2026-03-19T10:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.271551 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.271596 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.271607 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.271625 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.271638 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:17Z","lastTransitionTime":"2026-03-19T10:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.355611 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.355752 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.355877 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:17 crc kubenswrapper[4765]: E0319 10:23:17.356209 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:17 crc kubenswrapper[4765]: E0319 10:23:17.356377 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:17 crc kubenswrapper[4765]: E0319 10:23:17.356575 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.375676 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.375735 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.375749 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.375774 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.375790 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:17Z","lastTransitionTime":"2026-03-19T10:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.478893 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.478938 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.478948 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.478983 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.478993 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:17Z","lastTransitionTime":"2026-03-19T10:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.551460 4765 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.582423 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.582496 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.582512 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.582575 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.582609 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:17Z","lastTransitionTime":"2026-03-19T10:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.685890 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.685950 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.686006 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.686037 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.686054 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:17Z","lastTransitionTime":"2026-03-19T10:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.789369 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.789435 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.789450 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.789471 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.789485 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:17Z","lastTransitionTime":"2026-03-19T10:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.892987 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.893036 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.893049 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.893074 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.893089 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:17Z","lastTransitionTime":"2026-03-19T10:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.996158 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.996211 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.996228 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.996286 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:17 crc kubenswrapper[4765]: I0319 10:23:17.996308 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:17Z","lastTransitionTime":"2026-03-19T10:23:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.100636 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.100695 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.100714 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.100743 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.100763 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:18Z","lastTransitionTime":"2026-03-19T10:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.204295 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.204394 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.204413 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.204435 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.204451 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:18Z","lastTransitionTime":"2026-03-19T10:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.307246 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.307324 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.307344 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.307381 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.307423 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:18Z","lastTransitionTime":"2026-03-19T10:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.410589 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.410696 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.410718 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.410752 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.410773 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:18Z","lastTransitionTime":"2026-03-19T10:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.514131 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.514213 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.514232 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.514265 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.514287 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:18Z","lastTransitionTime":"2026-03-19T10:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.617789 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.617857 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.617876 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.617904 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.617925 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:18Z","lastTransitionTime":"2026-03-19T10:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.721335 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.721434 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.721464 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.721720 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.721765 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:18Z","lastTransitionTime":"2026-03-19T10:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.825430 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.825489 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.825500 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.825522 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.825538 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:18Z","lastTransitionTime":"2026-03-19T10:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.928202 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.928239 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.928249 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.928269 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:18 crc kubenswrapper[4765]: I0319 10:23:18.928280 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:18Z","lastTransitionTime":"2026-03-19T10:23:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.011944 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.012349 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 10:23:23.012295187 +0000 UTC m=+101.361240769 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.012849 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.013089 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.013242 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:23.013221302 +0000 UTC m=+101.362166884 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.031170 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.031261 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.031279 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.031305 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.031324 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:19Z","lastTransitionTime":"2026-03-19T10:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.114506 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.114594 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.114618 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.114724 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:23.114698246 +0000 UTC m=+101.463643818 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.114269 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.115467 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.115643 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.115728 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:23.115706413 +0000 UTC m=+101.464651985 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.115797 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.115937 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.116042 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.116063 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.116136 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. 
No retries permitted until 2026-03-19 10:23:23.116117014 +0000 UTC m=+101.465062596 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.134865 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.135359 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.135653 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.136006 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.136298 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:19Z","lastTransitionTime":"2026-03-19T10:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.240444 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.240505 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.240517 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.240538 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.240552 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:19Z","lastTransitionTime":"2026-03-19T10:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.343636 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.343688 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.343704 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.343722 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.343731 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:19Z","lastTransitionTime":"2026-03-19T10:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.355804 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.355891 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.356621 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.356772 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.356993 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.357048 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.447480 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.447522 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.447532 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.447549 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.447560 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:19Z","lastTransitionTime":"2026-03-19T10:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.550381 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.550648 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.550719 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.550788 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.550844 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:19Z","lastTransitionTime":"2026-03-19T10:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.653346 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.653388 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.653400 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.653416 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.653427 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:19Z","lastTransitionTime":"2026-03-19T10:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.756497 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.756549 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.756561 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.756580 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.756593 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:19Z","lastTransitionTime":"2026-03-19T10:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.859811 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.859887 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.859908 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.859940 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.860003 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:19Z","lastTransitionTime":"2026-03-19T10:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.875953 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.876044 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.876063 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.876089 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.876107 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:19Z","lastTransitionTime":"2026-03-19T10:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.889448 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.896087 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.896120 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.896132 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.896150 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.896161 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:19Z","lastTransitionTime":"2026-03-19T10:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.911839 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.917105 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.917217 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.917228 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.917247 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.917261 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:19Z","lastTransitionTime":"2026-03-19T10:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.927179 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.930821 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.930872 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.930884 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.930903 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.930914 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:19Z","lastTransitionTime":"2026-03-19T10:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.940235 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.944612 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.944653 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.944662 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.944681 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.944696 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:19Z","lastTransitionTime":"2026-03-19T10:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.953712 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:19Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:19 crc kubenswrapper[4765]: E0319 10:23:19.953860 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.963482 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.963564 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.963584 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.963611 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:19 crc kubenswrapper[4765]: I0319 10:23:19.963638 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:19Z","lastTransitionTime":"2026-03-19T10:23:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.066220 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.066264 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.066276 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.066294 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.066305 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:20Z","lastTransitionTime":"2026-03-19T10:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.170091 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.170143 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.170155 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.170183 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.170199 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:20Z","lastTransitionTime":"2026-03-19T10:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.272092 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.272148 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.272158 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.272178 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.272188 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:20Z","lastTransitionTime":"2026-03-19T10:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.375374 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.375420 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.375431 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.375449 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.375463 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:20Z","lastTransitionTime":"2026-03-19T10:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.477894 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.477952 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.477981 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.477999 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.478011 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:20Z","lastTransitionTime":"2026-03-19T10:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.581406 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.581493 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.581507 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.581535 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.581552 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:20Z","lastTransitionTime":"2026-03-19T10:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.684322 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.684369 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.684379 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.684400 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.684411 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:20Z","lastTransitionTime":"2026-03-19T10:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.787108 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.787182 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.787202 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.787231 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.787251 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:20Z","lastTransitionTime":"2026-03-19T10:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.890391 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.890432 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.890443 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.890460 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.890472 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:20Z","lastTransitionTime":"2026-03-19T10:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.994264 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.994366 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.994388 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.994415 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:20 crc kubenswrapper[4765]: I0319 10:23:20.994436 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:20Z","lastTransitionTime":"2026-03-19T10:23:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.097742 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.097815 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.097831 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.097860 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.097878 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:21Z","lastTransitionTime":"2026-03-19T10:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.200894 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.200932 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.200943 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.200978 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.200993 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:21Z","lastTransitionTime":"2026-03-19T10:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.303706 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.303771 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.303781 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.303797 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.303806 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:21Z","lastTransitionTime":"2026-03-19T10:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.355396 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.355471 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.355568 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:21 crc kubenswrapper[4765]: E0319 10:23:21.355668 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:21 crc kubenswrapper[4765]: E0319 10:23:21.355821 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:21 crc kubenswrapper[4765]: E0319 10:23:21.355893 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.406290 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.406331 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.406340 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.406354 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.406364 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:21Z","lastTransitionTime":"2026-03-19T10:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.508660 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.508720 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.508729 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.508767 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.508780 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:21Z","lastTransitionTime":"2026-03-19T10:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.611516 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.611560 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.611572 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.611589 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.611602 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:21Z","lastTransitionTime":"2026-03-19T10:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.714094 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.714136 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.714147 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.714174 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.714187 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:21Z","lastTransitionTime":"2026-03-19T10:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.817175 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.817230 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.817243 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.817265 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.817281 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:21Z","lastTransitionTime":"2026-03-19T10:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.919283 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.919335 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.919345 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.919365 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:21 crc kubenswrapper[4765]: I0319 10:23:21.919376 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:21Z","lastTransitionTime":"2026-03-19T10:23:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.021772 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.021834 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.021852 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.021873 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.021891 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:22Z","lastTransitionTime":"2026-03-19T10:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.124201 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.124247 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.124255 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.124271 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.124281 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:22Z","lastTransitionTime":"2026-03-19T10:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.226508 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.226580 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.226597 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.226626 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.226645 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:22Z","lastTransitionTime":"2026-03-19T10:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.329367 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.329402 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.329414 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.329431 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.329442 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:22Z","lastTransitionTime":"2026-03-19T10:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.367893 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.376648 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.387781 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.399032 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.407783 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.417270 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.431928 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.432001 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.432015 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.432034 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.432049 4765 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:22Z","lastTransitionTime":"2026-03-19T10:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.533807 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.533844 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.533853 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.534057 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.534069 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:22Z","lastTransitionTime":"2026-03-19T10:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.636616 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.636694 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.636717 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.636745 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.636766 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:22Z","lastTransitionTime":"2026-03-19T10:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.739442 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.739580 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.739645 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.739675 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.739694 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:22Z","lastTransitionTime":"2026-03-19T10:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.842660 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.842720 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.842742 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.842771 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.842794 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:22Z","lastTransitionTime":"2026-03-19T10:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.944789 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.944830 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.944841 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.944857 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:22 crc kubenswrapper[4765]: I0319 10:23:22.944867 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:22Z","lastTransitionTime":"2026-03-19T10:23:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.047041 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.047086 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.047099 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.047117 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.047129 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:23Z","lastTransitionTime":"2026-03-19T10:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.053745 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.053903 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.054068 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:23:31.05402774 +0000 UTC m=+109.402973332 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.054071 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.054171 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:31.054154843 +0000 UTC m=+109.403100385 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.149701 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.149758 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.149772 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.149799 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.149824 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:23Z","lastTransitionTime":"2026-03-19T10:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.155450 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.155524 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.155576 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.155719 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.155786 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.155814 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod 
openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.155729 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.155882 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.155899 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.155725 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.155884 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:31.155861104 +0000 UTC m=+109.504806656 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.156010 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:31.155989047 +0000 UTC m=+109.504934589 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.156023 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:31.156017178 +0000 UTC m=+109.504962720 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.253209 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.253284 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.253307 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.253338 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.253360 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:23Z","lastTransitionTime":"2026-03-19T10:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.355028 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.355074 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.355080 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.355246 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.355376 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:23 crc kubenswrapper[4765]: E0319 10:23:23.355580 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.356409 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.356438 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.356448 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.356461 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.356472 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:23Z","lastTransitionTime":"2026-03-19T10:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.459833 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.459871 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.459878 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.459892 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.459900 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:23Z","lastTransitionTime":"2026-03-19T10:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.562707 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.562763 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.562776 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.562794 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.562805 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:23Z","lastTransitionTime":"2026-03-19T10:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.665637 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.665691 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.665706 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.665729 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.665794 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:23Z","lastTransitionTime":"2026-03-19T10:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.768193 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.768245 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.768255 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.768273 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.768286 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:23Z","lastTransitionTime":"2026-03-19T10:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.870881 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.870921 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.870931 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.870946 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.870976 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:23Z","lastTransitionTime":"2026-03-19T10:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.973953 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.974072 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.974098 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.974138 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:23 crc kubenswrapper[4765]: I0319 10:23:23.974165 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:23Z","lastTransitionTime":"2026-03-19T10:23:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.077449 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.077485 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.077494 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.077514 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.077524 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:24Z","lastTransitionTime":"2026-03-19T10:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.181201 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.181265 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.181289 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.181326 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.181351 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:24Z","lastTransitionTime":"2026-03-19T10:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.284587 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.284643 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.284652 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.284671 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.284682 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:24Z","lastTransitionTime":"2026-03-19T10:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.387700 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.387759 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.387771 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.387791 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.387804 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:24Z","lastTransitionTime":"2026-03-19T10:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.490009 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.490080 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.490097 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.490124 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.490142 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:24Z","lastTransitionTime":"2026-03-19T10:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.593534 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.593621 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.593650 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.593685 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.593710 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:24Z","lastTransitionTime":"2026-03-19T10:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.697017 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.697076 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.697097 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.697121 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.697140 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:24Z","lastTransitionTime":"2026-03-19T10:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.799417 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.799471 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.799483 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.799503 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.799516 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:24Z","lastTransitionTime":"2026-03-19T10:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.902715 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.902767 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.902781 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.902800 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:24 crc kubenswrapper[4765]: I0319 10:23:24.902814 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:24Z","lastTransitionTime":"2026-03-19T10:23:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.005634 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.005681 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.005692 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.005710 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.005724 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:25Z","lastTransitionTime":"2026-03-19T10:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.108456 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.108517 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.108530 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.108548 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.108560 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:25Z","lastTransitionTime":"2026-03-19T10:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.211765 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.211823 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.211840 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.211865 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.211881 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:25Z","lastTransitionTime":"2026-03-19T10:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.314855 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.314911 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.314928 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.314953 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.315007 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:25Z","lastTransitionTime":"2026-03-19T10:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.355732 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.355733 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:25 crc kubenswrapper[4765]: E0319 10:23:25.355918 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.355760 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:25 crc kubenswrapper[4765]: E0319 10:23:25.356096 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:25 crc kubenswrapper[4765]: E0319 10:23:25.356232 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.418066 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.418134 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.418147 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.418167 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.418180 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:25Z","lastTransitionTime":"2026-03-19T10:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.521661 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.521737 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.521756 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.521782 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.521802 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:25Z","lastTransitionTime":"2026-03-19T10:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.625398 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.625459 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.625471 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.625496 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.625509 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:25Z","lastTransitionTime":"2026-03-19T10:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.728780 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.728856 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.728876 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.728903 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.728922 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:25Z","lastTransitionTime":"2026-03-19T10:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.831402 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.831480 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.831498 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.831524 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.831542 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:25Z","lastTransitionTime":"2026-03-19T10:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.934599 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.934735 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.934754 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.934777 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:25 crc kubenswrapper[4765]: I0319 10:23:25.934794 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:25Z","lastTransitionTime":"2026-03-19T10:23:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.037417 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.037458 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.037468 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.037490 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.037503 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:26Z","lastTransitionTime":"2026-03-19T10:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.143352 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.144016 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.144057 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.144088 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.144112 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:26Z","lastTransitionTime":"2026-03-19T10:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.245982 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.246041 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.246054 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.246072 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.246085 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:26Z","lastTransitionTime":"2026-03-19T10:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.349037 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.349129 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.349153 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.349178 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.349193 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:26Z","lastTransitionTime":"2026-03-19T10:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.451879 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.451911 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.451919 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.451934 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.451944 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:26Z","lastTransitionTime":"2026-03-19T10:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.554750 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.554804 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.554816 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.554835 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.554857 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:26Z","lastTransitionTime":"2026-03-19T10:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.657150 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.657178 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.657186 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.657200 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.657208 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:26Z","lastTransitionTime":"2026-03-19T10:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.760670 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.760718 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.760733 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.760758 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.760769 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:26Z","lastTransitionTime":"2026-03-19T10:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.862784 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.862839 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.862856 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.862878 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.862893 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:26Z","lastTransitionTime":"2026-03-19T10:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.965706 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.965781 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.965800 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.965829 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:26 crc kubenswrapper[4765]: I0319 10:23:26.965849 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:26Z","lastTransitionTime":"2026-03-19T10:23:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.068620 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.068666 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.068677 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.068696 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.068706 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:27Z","lastTransitionTime":"2026-03-19T10:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.170779 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.170855 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.170864 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.170881 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.170931 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:27Z","lastTransitionTime":"2026-03-19T10:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.273832 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.273889 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.273900 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.273919 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.273931 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:27Z","lastTransitionTime":"2026-03-19T10:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.356450 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.356562 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.356563 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:27 crc kubenswrapper[4765]: E0319 10:23:27.357673 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:27 crc kubenswrapper[4765]: E0319 10:23:27.357736 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:27 crc kubenswrapper[4765]: E0319 10:23:27.357997 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:27 crc kubenswrapper[4765]: E0319 10:23:27.358384 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:23:27 crc kubenswrapper[4765]: container &Container{Name:webhook,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 10:23:27 crc kubenswrapper[4765]: if [[ -f "/env/_master" ]]; then Mar 19 10:23:27 crc kubenswrapper[4765]: set -o allexport Mar 19 10:23:27 crc kubenswrapper[4765]: source "/env/_master" Mar 19 10:23:27 crc kubenswrapper[4765]: set +o allexport Mar 19 10:23:27 crc kubenswrapper[4765]: fi Mar 19 10:23:27 crc kubenswrapper[4765]: # OVN-K will try to remove hybrid overlay node annotations even when the hybrid overlay is not enabled. Mar 19 10:23:27 crc kubenswrapper[4765]: # https://github.com/ovn-org/ovn-kubernetes/blob/ac6820df0b338a246f10f412cd5ec903bd234694/go-controller/pkg/ovn/master.go#L791 Mar 19 10:23:27 crc kubenswrapper[4765]: ho_enable="--enable-hybrid-overlay" Mar 19 10:23:27 crc kubenswrapper[4765]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start webhook" Mar 19 10:23:27 crc kubenswrapper[4765]: # extra-allowed-user: service account `ovn-kubernetes-control-plane` Mar 19 10:23:27 crc kubenswrapper[4765]: # sets pod annotations in multi-homing layer3 network controller (cluster-manager) Mar 19 10:23:27 crc kubenswrapper[4765]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 10:23:27 crc kubenswrapper[4765]: --webhook-cert-dir="/etc/webhook-cert" \ Mar 19 10:23:27 crc kubenswrapper[4765]: --webhook-host=127.0.0.1 \ Mar 19 10:23:27 crc kubenswrapper[4765]: --webhook-port=9743 \ Mar 19 10:23:27 crc kubenswrapper[4765]: ${ho_enable} \ Mar 19 10:23:27 crc kubenswrapper[4765]: --enable-interconnect \ Mar 19 10:23:27 crc 
kubenswrapper[4765]: --disable-approver \ Mar 19 10:23:27 crc kubenswrapper[4765]: --extra-allowed-user="system:serviceaccount:openshift-ovn-kubernetes:ovn-kubernetes-control-plane" \ Mar 19 10:23:27 crc kubenswrapper[4765]: --wait-for-kubernetes-api=200s \ Mar 19 10:23:27 crc kubenswrapper[4765]: --pod-admission-conditions="/var/run/ovnkube-identity-config/additional-pod-admission-cond.json" \ Mar 19 10:23:27 crc kubenswrapper[4765]: --loglevel="${LOGLEVEL}" Mar 19 10:23:27 crc kubenswrapper[4765]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:2,ValueFrom:nil,},EnvVar{Name:KUBERNETES_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:webhook-cert,ReadOnly:false,MountPath:/etc/webhook-cert/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions
:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 10:23:27 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:23:27 crc kubenswrapper[4765]: E0319 10:23:27.361062 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:23:27 crc kubenswrapper[4765]: container &Container{Name:approver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c set -xe Mar 19 10:23:27 crc kubenswrapper[4765]: if [[ -f "/env/_master" ]]; then Mar 19 10:23:27 crc kubenswrapper[4765]: set -o allexport Mar 19 10:23:27 crc kubenswrapper[4765]: source "/env/_master" Mar 19 10:23:27 crc kubenswrapper[4765]: set +o allexport Mar 19 10:23:27 crc kubenswrapper[4765]: fi Mar 19 10:23:27 crc kubenswrapper[4765]: Mar 19 10:23:27 crc kubenswrapper[4765]: echo "I$(date "+%m%d %H:%M:%S.%N") - network-node-identity - start approver" Mar 19 10:23:27 crc kubenswrapper[4765]: exec /usr/bin/ovnkube-identity --k8s-apiserver=https://api-int.crc.testing:6443 \ Mar 19 10:23:27 crc kubenswrapper[4765]: --disable-webhook \ Mar 19 10:23:27 crc kubenswrapper[4765]: --csr-acceptance-conditions="/var/run/ovnkube-identity-config/additional-cert-acceptance-cond.json" \ Mar 19 10:23:27 crc kubenswrapper[4765]: --loglevel="${LOGLEVEL}" Mar 19 10:23:27 crc kubenswrapper[4765]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOGLEVEL,Value:4,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m 
DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:env-overrides,ReadOnly:false,MountPath:/env,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ovnkube-identity-cm,ReadOnly:false,MountPath:/var/run/ovnkube-identity-config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-s2kz5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000470000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-node-identity-vrzqb_openshift-network-node-identity(ef543e1b-8068-4ea3-b32a-61027b32e95d): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 10:23:27 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:23:27 crc kubenswrapper[4765]: E0319 10:23:27.362933 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"webhook\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"approver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" podUID="ef543e1b-8068-4ea3-b32a-61027b32e95d" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.371408 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.371618 4765 scope.go:117] "RemoveContainer" containerID="678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1" Mar 19 10:23:27 crc kubenswrapper[4765]: E0319 10:23:27.371884 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.376231 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.376273 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.376285 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.376305 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.376317 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:27Z","lastTransitionTime":"2026-03-19T10:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.474810 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.478765 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.478822 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.478832 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.478854 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.478868 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:27Z","lastTransitionTime":"2026-03-19T10:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.581662 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.581697 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.581705 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.581721 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.581732 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:27Z","lastTransitionTime":"2026-03-19T10:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.684923 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.685171 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.685267 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.685338 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.685407 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:27Z","lastTransitionTime":"2026-03-19T10:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.788017 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.788296 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.788356 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.788430 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.788487 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:27Z","lastTransitionTime":"2026-03-19T10:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.892805 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.892884 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.892906 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.892934 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.893001 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:27Z","lastTransitionTime":"2026-03-19T10:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.931538 4765 scope.go:117] "RemoveContainer" containerID="678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1" Mar 19 10:23:27 crc kubenswrapper[4765]: E0319 10:23:27.931739 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.995645 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.995697 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.995706 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.995720 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:27 crc kubenswrapper[4765]: I0319 10:23:27.995750 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:27Z","lastTransitionTime":"2026-03-19T10:23:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.098541 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.098582 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.098591 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.098609 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.098622 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:28Z","lastTransitionTime":"2026-03-19T10:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.201002 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.201059 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.201066 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.201084 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.201094 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:28Z","lastTransitionTime":"2026-03-19T10:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.304015 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.304067 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.304083 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.304104 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.304120 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:28Z","lastTransitionTime":"2026-03-19T10:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:28 crc kubenswrapper[4765]: E0319 10:23:28.357982 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:iptables-alerter,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/iptables-alerter/iptables-alerter.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONTAINER_RUNTIME_ENDPOINT,Value:unix:///run/crio/crio.sock,ValueFrom:nil,},EnvVar{Name:ALERTER_POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:iptables-alerter-script,ReadOnly:false,MountPath:/iptables-alerter,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-slash,ReadOnly:true,MountPath:/host,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rczfb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Re
startPolicy:nil,} start failed in pod iptables-alerter-4ln5h_openshift-network-operator(d75a4c96-2883-4a0b-bab2-0fab2b6c0b49): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 10:23:28 crc kubenswrapper[4765]: E0319 10:23:28.360195 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"iptables-alerter\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/iptables-alerter-4ln5h" podUID="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.406438 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.406630 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.406691 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.406769 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.406833 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:28Z","lastTransitionTime":"2026-03-19T10:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.509949 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.510031 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.510047 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.510074 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.510099 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:28Z","lastTransitionTime":"2026-03-19T10:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.612795 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.612867 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.612883 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.612910 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.612927 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:28Z","lastTransitionTime":"2026-03-19T10:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.716205 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.716246 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.716256 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.716271 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.716281 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:28Z","lastTransitionTime":"2026-03-19T10:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.818616 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.818690 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.818699 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.818724 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.818741 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:28Z","lastTransitionTime":"2026-03-19T10:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.920954 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.921018 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.921030 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.921047 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:28 crc kubenswrapper[4765]: I0319 10:23:28.921058 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:28Z","lastTransitionTime":"2026-03-19T10:23:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.023722 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.023763 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.023775 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.023792 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.023804 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:29Z","lastTransitionTime":"2026-03-19T10:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.125830 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.126105 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.126171 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.126259 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.126337 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:29Z","lastTransitionTime":"2026-03-19T10:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.229242 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.229314 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.229327 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.229351 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.229362 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:29Z","lastTransitionTime":"2026-03-19T10:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.332055 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.332124 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.332142 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.332174 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.332196 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:29Z","lastTransitionTime":"2026-03-19T10:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.355431 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.355521 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.355506 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:29 crc kubenswrapper[4765]: E0319 10:23:29.355602 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:29 crc kubenswrapper[4765]: E0319 10:23:29.355690 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:29 crc kubenswrapper[4765]: E0319 10:23:29.355863 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.435330 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.435385 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.435395 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.435416 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.435429 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:29Z","lastTransitionTime":"2026-03-19T10:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.538763 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.538847 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.538872 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.538904 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.538929 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:29Z","lastTransitionTime":"2026-03-19T10:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.641806 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.641849 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.641858 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.641876 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.641887 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:29Z","lastTransitionTime":"2026-03-19T10:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.744570 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.744620 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.744633 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.744652 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.744664 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:29Z","lastTransitionTime":"2026-03-19T10:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.847134 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.847499 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.847628 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.847764 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.847926 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:29Z","lastTransitionTime":"2026-03-19T10:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.950468 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.950516 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.950527 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.950544 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.950557 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:29Z","lastTransitionTime":"2026-03-19T10:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.959829 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.959914 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.959941 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.960016 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.960085 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:29Z","lastTransitionTime":"2026-03-19T10:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:29 crc kubenswrapper[4765]: E0319 10:23:29.978270 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.983757 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.983911 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.984023 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.984184 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:29 crc kubenswrapper[4765]: I0319 10:23:29.984257 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:29Z","lastTransitionTime":"2026-03-19T10:23:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:29 crc kubenswrapper[4765]: E0319 10:23:29.998060 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:29Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.002657 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.002745 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.002763 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.002789 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.002807 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:30Z","lastTransitionTime":"2026-03-19T10:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:30 crc kubenswrapper[4765]: E0319 10:23:30.018053 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.022302 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.022379 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.022391 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.022413 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.022425 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:30Z","lastTransitionTime":"2026-03-19T10:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:30 crc kubenswrapper[4765]: E0319 10:23:30.037570 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.041573 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.041617 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.041629 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.041648 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.041662 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:30Z","lastTransitionTime":"2026-03-19T10:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:30 crc kubenswrapper[4765]: E0319 10:23:30.052474 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:30 crc kubenswrapper[4765]: E0319 10:23:30.052629 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.054749 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.054849 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.054877 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.054911 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.054935 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:30Z","lastTransitionTime":"2026-03-19T10:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.158014 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.158087 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.158107 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.158146 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.158183 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:30Z","lastTransitionTime":"2026-03-19T10:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.261145 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.261199 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.261211 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.261232 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.261247 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:30Z","lastTransitionTime":"2026-03-19T10:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:30 crc kubenswrapper[4765]: E0319 10:23:30.357668 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:23:30 crc kubenswrapper[4765]: container &Container{Name:network-operator,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,Command:[/bin/bash -c #!/bin/bash Mar 19 10:23:30 crc kubenswrapper[4765]: set -o allexport Mar 19 10:23:30 crc kubenswrapper[4765]: if [[ -f /etc/kubernetes/apiserver-url.env ]]; then Mar 19 10:23:30 crc kubenswrapper[4765]: source /etc/kubernetes/apiserver-url.env Mar 19 10:23:30 crc kubenswrapper[4765]: else Mar 19 10:23:30 crc kubenswrapper[4765]: echo "Error: /etc/kubernetes/apiserver-url.env is missing" Mar 19 10:23:30 crc kubenswrapper[4765]: exit 1 Mar 19 10:23:30 crc kubenswrapper[4765]: fi Mar 19 10:23:30 crc kubenswrapper[4765]: exec /usr/bin/cluster-network-operator start --listen=0.0.0.0:9104 Mar 19 10:23:30 crc kubenswrapper[4765]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:cno,HostPort:9104,ContainerPort:9104,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:RELEASE_VERSION,Value:4.18.1,ValueFrom:nil,},EnvVar{Name:KUBE_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b97554198294bf544fbc116c94a0a1fb2ec8a4de0e926bf9d9e320135f0bee6f,ValueFrom:nil,},EnvVar{Name:KUBE_RBAC_PROXY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,ValueFrom:nil,},EnvVar{Name:MULTUS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,ValueFrom:nil,},EnvVar{Name:MULTUS_ADMISSION_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317,ValueFrom:nil,},EnvVar{Name:CNI_PLUGINS_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc,ValueFrom:nil,},EnvVar{Name:BOND_CNI_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78,ValueFrom:nil,},EnvVar{Name:WHEREABOUTS_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4,ValueFrom:nil,},EnvVar{Name:ROUTE_OVERRRIDE_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa,ValueFrom:nil,},EnvVar{Name:MULTUS_NETWORKPOLICY_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:23f833d3738d68706eb2f2868bd76bd71cee016cffa6faf5f045a60cc8c6eddd,ValueFrom:nil,},EnvVar{Name:OVN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,ValueFrom:nil,},EnvVar{Name:OVN_NB_RAFT_ELECTION_TIMER,Value:10,ValueFrom:nil,},
EnvVar{Name:OVN_SB_RAFT_ELECTION_TIMER,Value:16,ValueFrom:nil,},EnvVar{Name:OVN_NORTHD_PROBE_INTERVAL,Value:10000,ValueFrom:nil,},EnvVar{Name:OVN_CONTROLLER_INACTIVITY_PROBE,Value:180000,ValueFrom:nil,},EnvVar{Name:OVN_NB_INACTIVITY_PROBE,Value:60000,ValueFrom:nil,},EnvVar{Name:EGRESS_ROUTER_CNI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,ValueFrom:nil,},EnvVar{Name:NETWORK_METRICS_DAEMON_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_SOURCE_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_CHECK_TARGET_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:NETWORK_OPERATOR_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b,ValueFrom:nil,},EnvVar{Name:CLOUD_NETWORK_CONFIG_CONTROLLER_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8048f1cb0be521f09749c0a489503cd56d85b68c6ca93380e082cfd693cd97a8,ValueFrom:nil,},EnvVar{Name:CLI_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,ValueFrom:nil,},EnvVar{Name:FRR_K8S_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5dbf844e49bb46b78586930149e5e5f5dc121014c8afd10fe36f3651967cc256,ValueFrom:nil,},EnvVar{Name:NETWORKING_CONSOLE_PLUGIN_IMAGE,Value:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd,ValueFrom:nil,},EnvVar{Name:POD_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.name,},ResourceFi
eldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:host-etc-kube,ReadOnly:true,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:metrics-tls,ReadOnly:false,MountPath:/var/run/secrets/serving-cert,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-rdwmf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod network-operator-58b4c7f79c-55gtf_openshift-network-operator(37a5e44f-9a88-4405-be8a-b645485e7312): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 10:23:30 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:23:30 crc kubenswrapper[4765]: E0319 10:23:30.359062 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"network-operator\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" podUID="37a5e44f-9a88-4405-be8a-b645485e7312" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.363263 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 
10:23:30.363325 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.363343 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.363372 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.363390 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:30Z","lastTransitionTime":"2026-03-19T10:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.465640 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.465710 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.465733 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.465764 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.465786 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:30Z","lastTransitionTime":"2026-03-19T10:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.568626 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.568657 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.568665 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.568679 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.568689 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:30Z","lastTransitionTime":"2026-03-19T10:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.672448 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.672517 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.672529 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.672548 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.672561 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:30Z","lastTransitionTime":"2026-03-19T10:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.775374 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.775436 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.775450 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.775469 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.775479 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:30Z","lastTransitionTime":"2026-03-19T10:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.878555 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.878603 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.878614 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.878634 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.878647 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:30Z","lastTransitionTime":"2026-03-19T10:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.981324 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.981439 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.981459 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.981492 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.981519 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:30Z","lastTransitionTime":"2026-03-19T10:23:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.995010 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-wcdqx"] Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.995552 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wcdqx" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.997558 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.998477 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 10:23:30 crc kubenswrapper[4765]: I0319 10:23:30.999434 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.007426 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.023844 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.034130 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.034642 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c2b2ebd2-48e2-431e-a91d-faa3fc4f3965-hosts-file\") pod \"node-resolver-wcdqx\" (UID: \"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\") " pod="openshift-dns/node-resolver-wcdqx" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.034693 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sdtb\" (UniqueName: \"kubernetes.io/projected/c2b2ebd2-48e2-431e-a91d-faa3fc4f3965-kube-api-access-4sdtb\") pod 
\"node-resolver-wcdqx\" (UID: \"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\") " pod="openshift-dns/node-resolver-wcdqx" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.046722 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.060183 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c
162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.071409 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.083377 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.083642 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 
10:23:31.083716 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.083805 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.083872 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:31Z","lastTransitionTime":"2026-03-19T10:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.085019 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.096410 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.112132 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.135348 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.135461 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:23:47.135432528 +0000 UTC m=+125.484378070 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.135696 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c2b2ebd2-48e2-431e-a91d-faa3fc4f3965-hosts-file\") pod \"node-resolver-wcdqx\" (UID: \"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\") " pod="openshift-dns/node-resolver-wcdqx" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.135803 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sdtb\" (UniqueName: \"kubernetes.io/projected/c2b2ebd2-48e2-431e-a91d-faa3fc4f3965-kube-api-access-4sdtb\") pod \"node-resolver-wcdqx\" (UID: \"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\") " pod="openshift-dns/node-resolver-wcdqx" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.135920 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.135830 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/c2b2ebd2-48e2-431e-a91d-faa3fc4f3965-hosts-file\") pod \"node-resolver-wcdqx\" (UID: \"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\") " pod="openshift-dns/node-resolver-wcdqx" Mar 19 
10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.136034 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.136225 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:47.136213049 +0000 UTC m=+125.485158591 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.153492 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sdtb\" (UniqueName: \"kubernetes.io/projected/c2b2ebd2-48e2-431e-a91d-faa3fc4f3965-kube-api-access-4sdtb\") pod \"node-resolver-wcdqx\" (UID: \"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\") " pod="openshift-dns/node-resolver-wcdqx" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.185746 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.185785 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.185794 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.185812 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 
10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.185821 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:31Z","lastTransitionTime":"2026-03-19T10:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.236717 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.236770 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.236793 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.236914 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.236931 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.236942 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.236953 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.237017 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.237045 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:47.237027995 +0000 UTC m=+125.585973537 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.237059 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.237084 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.237126 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:47.237084147 +0000 UTC m=+125.586029729 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.237216 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:47.23719345 +0000 UTC m=+125.586139022 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.288991 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.289030 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.289039 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.289053 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.289063 4765 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:31Z","lastTransitionTime":"2026-03-19T10:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.310502 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wcdqx" Mar 19 10:23:31 crc kubenswrapper[4765]: W0319 10:23:31.328899 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2b2ebd2_48e2_431e_a91d_faa3fc4f3965.slice/crio-56d0aee96061670f637537f3f20ab314bfbc131d1019778ea25ab3da6a812fce WatchSource:0}: Error finding container 56d0aee96061670f637537f3f20ab314bfbc131d1019778ea25ab3da6a812fce: Status 404 returned error can't find the container with id 56d0aee96061670f637537f3f20ab314bfbc131d1019778ea25ab3da6a812fce Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.331327 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:23:31 crc kubenswrapper[4765]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 19 10:23:31 crc kubenswrapper[4765]: set -uo pipefail Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 19 10:23:31 crc kubenswrapper[4765]: HOSTS_FILE="/etc/hosts" Mar 19 10:23:31 crc kubenswrapper[4765]: TEMP_FILE="/etc/hosts.tmp" Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc 
kubenswrapper[4765]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: # Make a temporary file with the old hosts file's attributes. Mar 19 10:23:31 crc kubenswrapper[4765]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 19 10:23:31 crc kubenswrapper[4765]: echo "Failed to preserve hosts file. Exiting." Mar 19 10:23:31 crc kubenswrapper[4765]: exit 1 Mar 19 10:23:31 crc kubenswrapper[4765]: fi Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: while true; do Mar 19 10:23:31 crc kubenswrapper[4765]: declare -A svc_ips Mar 19 10:23:31 crc kubenswrapper[4765]: for svc in "${services[@]}"; do Mar 19 10:23:31 crc kubenswrapper[4765]: # Fetch service IP from cluster dns if present. We make several tries Mar 19 10:23:31 crc kubenswrapper[4765]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. The two last ones Mar 19 10:23:31 crc kubenswrapper[4765]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 19 10:23:31 crc kubenswrapper[4765]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 19 10:23:31 crc kubenswrapper[4765]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 10:23:31 crc kubenswrapper[4765]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 10:23:31 crc kubenswrapper[4765]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 10:23:31 crc kubenswrapper[4765]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 19 10:23:31 crc kubenswrapper[4765]: for i in ${!cmds[*]} Mar 19 10:23:31 crc kubenswrapper[4765]: do Mar 19 10:23:31 crc kubenswrapper[4765]: ips=($(eval "${cmds[i]}")) Mar 19 10:23:31 crc kubenswrapper[4765]: if [[ "$?" 
-eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 19 10:23:31 crc kubenswrapper[4765]: svc_ips["${svc}"]="${ips[@]}" Mar 19 10:23:31 crc kubenswrapper[4765]: break Mar 19 10:23:31 crc kubenswrapper[4765]: fi Mar 19 10:23:31 crc kubenswrapper[4765]: done Mar 19 10:23:31 crc kubenswrapper[4765]: done Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: # Update /etc/hosts only if we get valid service IPs Mar 19 10:23:31 crc kubenswrapper[4765]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 19 10:23:31 crc kubenswrapper[4765]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 19 10:23:31 crc kubenswrapper[4765]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 19 10:23:31 crc kubenswrapper[4765]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 19 10:23:31 crc kubenswrapper[4765]: if ! sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 19 10:23:31 crc kubenswrapper[4765]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 19 10:23:31 crc kubenswrapper[4765]: sleep 60 & wait Mar 19 10:23:31 crc kubenswrapper[4765]: continue Mar 19 10:23:31 crc kubenswrapper[4765]: fi Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: # Append resolver entries for services Mar 19 10:23:31 crc kubenswrapper[4765]: rc=0 Mar 19 10:23:31 crc kubenswrapper[4765]: for svc in "${!svc_ips[@]}"; do Mar 19 10:23:31 crc kubenswrapper[4765]: for ip in ${svc_ips[${svc}]}; do Mar 19 10:23:31 crc kubenswrapper[4765]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? 
Mar 19 10:23:31 crc kubenswrapper[4765]: done Mar 19 10:23:31 crc kubenswrapper[4765]: done Mar 19 10:23:31 crc kubenswrapper[4765]: if [[ $rc -ne 0 ]]; then Mar 19 10:23:31 crc kubenswrapper[4765]: sleep 60 & wait Mar 19 10:23:31 crc kubenswrapper[4765]: continue Mar 19 10:23:31 crc kubenswrapper[4765]: fi Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 19 10:23:31 crc kubenswrapper[4765]: # Replace /etc/hosts with our modified version if needed Mar 19 10:23:31 crc kubenswrapper[4765]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 19 10:23:31 crc kubenswrapper[4765]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 19 10:23:31 crc kubenswrapper[4765]: fi Mar 19 10:23:31 crc kubenswrapper[4765]: sleep 60 & wait Mar 19 10:23:31 crc kubenswrapper[4765]: unset svc_ips Mar 19 10:23:31 crc kubenswrapper[4765]: done Mar 19 10:23:31 crc kubenswrapper[4765]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4sdtb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-wcdqx_openshift-dns(c2b2ebd2-48e2-431e-a91d-faa3fc4f3965): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 10:23:31 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.332674 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-wcdqx" podUID="c2b2ebd2-48e2-431e-a91d-faa3fc4f3965" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.339126 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-4sj5l"] Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.339578 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.340204 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-mmrh7"] Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.340879 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-79fbl"] Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.341020 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.341671 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.341748 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.341787 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.342473 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.342707 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.342772 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.343132 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.343245 4765 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.343583 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.343753 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.343993 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.344084 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.344288 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.355278 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.355319 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.355537 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.355681 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.355321 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.355910 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.357561 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.372423 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.379989 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.391520 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.391561 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.391571 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.391586 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.391598 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:31Z","lastTransitionTime":"2026-03-19T10:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.398902 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.408497 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.417520 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.427801 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.436012 4765 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.438684 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww7wr\" (UniqueName: \"kubernetes.io/projected/a6231872-d9e9-455e-92f1-51acc5985f6a-kube-api-access-ww7wr\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.438725 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6231872-d9e9-455e-92f1-51acc5985f6a-system-cni-dir\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.438745 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6231872-d9e9-455e-92f1-51acc5985f6a-os-release\") pod 
\"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.438765 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-os-release\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.438782 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-var-lib-kubelet\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.438811 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-system-cni-dir\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.438840 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-run-k8s-cni-cncf-io\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.438855 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a6231872-d9e9-455e-92f1-51acc5985f6a-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.438871 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a7d72ad1-7f25-4580-b845-7f66e8f78bff-mcd-auth-proxy-config\") pod \"machine-config-daemon-4sj5l\" (UID: \"a7d72ad1-7f25-4580-b845-7f66e8f78bff\") " pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.438887 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-multus-daemon-config\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.438904 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98wwh\" (UniqueName: \"kubernetes.io/projected/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-kube-api-access-98wwh\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.438920 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6231872-d9e9-455e-92f1-51acc5985f6a-cnibin\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.438936 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/a7d72ad1-7f25-4580-b845-7f66e8f78bff-proxy-tls\") pod \"machine-config-daemon-4sj5l\" (UID: \"a7d72ad1-7f25-4580-b845-7f66e8f78bff\") " pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.438952 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-multus-cni-dir\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.438983 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-multus-socket-dir-parent\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.439015 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-run-netns\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.439029 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-hostroot\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.439046 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/a6231872-d9e9-455e-92f1-51acc5985f6a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.439094 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-cnibin\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.439136 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-multus-conf-dir\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.439156 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fj62\" (UniqueName: \"kubernetes.io/projected/a7d72ad1-7f25-4580-b845-7f66e8f78bff-kube-api-access-2fj62\") pod \"machine-config-daemon-4sj5l\" (UID: \"a7d72ad1-7f25-4580-b845-7f66e8f78bff\") " pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.439209 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-etc-kubernetes\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.439226 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-cni-binary-copy\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.439244 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-var-lib-cni-bin\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.439259 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-run-multus-certs\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.439300 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-var-lib-cni-multus\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.439316 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a7d72ad1-7f25-4580-b845-7f66e8f78bff-rootfs\") pod \"machine-config-daemon-4sj5l\" (UID: \"a7d72ad1-7f25-4580-b845-7f66e8f78bff\") " pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.439362 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/a6231872-d9e9-455e-92f1-51acc5985f6a-cni-binary-copy\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.445170 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.452569 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.465801 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.477221 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.493899 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.493943 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.493953 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.494005 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.494016 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:31Z","lastTransitionTime":"2026-03-19T10:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.498292 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.509835 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.526865 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.539976 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a6231872-d9e9-455e-92f1-51acc5985f6a-cni-binary-copy\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540043 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6231872-d9e9-455e-92f1-51acc5985f6a-system-cni-dir\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540066 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww7wr\" (UniqueName: \"kubernetes.io/projected/a6231872-d9e9-455e-92f1-51acc5985f6a-kube-api-access-ww7wr\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540112 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6231872-d9e9-455e-92f1-51acc5985f6a-os-release\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540138 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-os-release\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540160 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-var-lib-kubelet\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540201 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-system-cni-dir\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540236 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-run-k8s-cni-cncf-io\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540276 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a6231872-d9e9-455e-92f1-51acc5985f6a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540299 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a7d72ad1-7f25-4580-b845-7f66e8f78bff-mcd-auth-proxy-config\") pod \"machine-config-daemon-4sj5l\" (UID: \"a7d72ad1-7f25-4580-b845-7f66e8f78bff\") " pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540353 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-multus-daemon-config\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540378 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-98wwh\" (UniqueName: \"kubernetes.io/projected/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-kube-api-access-98wwh\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540472 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-var-lib-kubelet\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540669 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-os-release\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540698 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a6231872-d9e9-455e-92f1-51acc5985f6a-system-cni-dir\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540732 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-run-k8s-cni-cncf-io\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540744 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a6231872-d9e9-455e-92f1-51acc5985f6a-os-release\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540755 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-system-cni-dir\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540900 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6231872-d9e9-455e-92f1-51acc5985f6a-cnibin\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.541588 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a7d72ad1-7f25-4580-b845-7f66e8f78bff-mcd-auth-proxy-config\") pod \"machine-config-daemon-4sj5l\" (UID: \"a7d72ad1-7f25-4580-b845-7f66e8f78bff\") " pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.541699 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-multus-daemon-config\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.541755 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/a6231872-d9e9-455e-92f1-51acc5985f6a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.541815 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/a6231872-d9e9-455e-92f1-51acc5985f6a-cni-binary-copy\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.540399 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a6231872-d9e9-455e-92f1-51acc5985f6a-cnibin\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.542309 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7d72ad1-7f25-4580-b845-7f66e8f78bff-proxy-tls\") pod \"machine-config-daemon-4sj5l\" (UID: \"a7d72ad1-7f25-4580-b845-7f66e8f78bff\") " pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.542360 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-multus-socket-dir-parent\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.542565 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-multus-socket-dir-parent\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.542654 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-multus-cni-dir\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.542387 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-multus-cni-dir\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543136 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-run-netns\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543169 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-hostroot\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543191 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6231872-d9e9-455e-92f1-51acc5985f6a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543228 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2fj62\" (UniqueName: \"kubernetes.io/projected/a7d72ad1-7f25-4580-b845-7f66e8f78bff-kube-api-access-2fj62\") pod \"machine-config-daemon-4sj5l\" (UID: 
\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\") " pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543249 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-cnibin\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543269 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-multus-conf-dir\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543319 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-etc-kubernetes\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543342 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-run-multus-certs\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543387 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-cni-binary-copy\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543420 
4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-var-lib-cni-bin\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543460 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a7d72ad1-7f25-4580-b845-7f66e8f78bff-rootfs\") pod \"machine-config-daemon-4sj5l\" (UID: \"a7d72ad1-7f25-4580-b845-7f66e8f78bff\") " pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543492 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-var-lib-cni-multus\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543569 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-var-lib-cni-multus\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543627 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-run-netns\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543656 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-hostroot\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543804 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-etc-kubernetes\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543879 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/a7d72ad1-7f25-4580-b845-7f66e8f78bff-rootfs\") pod \"machine-config-daemon-4sj5l\" (UID: \"a7d72ad1-7f25-4580-b845-7f66e8f78bff\") " pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543892 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-var-lib-cni-bin\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543911 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-host-run-multus-certs\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.543943 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-cnibin\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " 
pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.544117 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-multus-conf-dir\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.544385 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-cni-binary-copy\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.547124 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a6231872-d9e9-455e-92f1-51acc5985f6a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.551897 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/a7d72ad1-7f25-4580-b845-7f66e8f78bff-proxy-tls\") pod \"machine-config-daemon-4sj5l\" (UID: \"a7d72ad1-7f25-4580-b845-7f66e8f78bff\") " pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.566252 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-98wwh\" (UniqueName: \"kubernetes.io/projected/d9d027fd-4e70-4daf-9dd2-adefcc2a868f-kube-api-access-98wwh\") pod \"multus-mmrh7\" (UID: \"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\") " pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.566731 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fj62\" (UniqueName: \"kubernetes.io/projected/a7d72ad1-7f25-4580-b845-7f66e8f78bff-kube-api-access-2fj62\") pod \"machine-config-daemon-4sj5l\" (UID: \"a7d72ad1-7f25-4580-b845-7f66e8f78bff\") " pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.568169 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.572696 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww7wr\" (UniqueName: \"kubernetes.io/projected/a6231872-d9e9-455e-92f1-51acc5985f6a-kube-api-access-ww7wr\") pod \"multus-additional-cni-plugins-79fbl\" (UID: \"a6231872-d9e9-455e-92f1-51acc5985f6a\") " pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.583425 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.596518 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.597044 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.597116 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.597146 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.597168 4765 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeNotReady" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.597180 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:31Z","lastTransitionTime":"2026-03-19T10:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.605698 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.616811 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.624159 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.634504 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.655199 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.664938 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-mmrh7" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.669517 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start --payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fj62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.673146 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml 
--tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fj62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.673412 4765 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-79fbl" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.674309 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:23:31 crc kubenswrapper[4765]: W0319 10:23:31.674558 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9d027fd_4e70_4daf_9dd2_adefcc2a868f.slice/crio-a0be712b18d15509ac9f6c199a955cc18a28362c82933369fb7097fde16d9c02 WatchSource:0}: Error finding container a0be712b18d15509ac9f6c199a955cc18a28362c82933369fb7097fde16d9c02: Status 404 returned error can't find the container with id a0be712b18d15509ac9f6c199a955cc18a28362c82933369fb7097fde16d9c02 Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.678315 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:23:31 crc kubenswrapper[4765]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 19 10:23:31 crc kubenswrapper[4765]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 19 10:23:31 crc kubenswrapper[4765]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-98wwh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-mmrh7_openshift-multus(d9d027fd-4e70-4daf-9dd2-adefcc2a868f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 10:23:31 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.679684 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-mmrh7" podUID="d9d027fd-4e70-4daf-9dd2-adefcc2a868f" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.689457 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kvv2h"] Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.689913 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ww7wr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-79fbl_openshift-multus(a6231872-d9e9-455e-92f1-51acc5985f6a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.690452 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.691128 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-79fbl" podUID="a6231872-d9e9-455e-92f1-51acc5985f6a" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.693427 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.693838 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.694244 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.694492 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.694703 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.694749 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.694917 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.699969 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.700084 4765 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.700155 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.700217 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.700271 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:31Z","lastTransitionTime":"2026-03-19T10:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.702344 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.711581 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.725539 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.734666 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.744795 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-run-netns\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.744840 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-systemd\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.744860 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-ovnkube-script-lib\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.744878 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-systemd-units\") pod \"ovnkube-node-kvv2h\" (UID: 
\"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.744896 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-ovn\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.744912 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-node-log\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.744929 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-cni-bin\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.744952 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-ovnkube-config\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.744998 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/71cc276b-f25c-460b-b718-f058cc1d2521-ovn-node-metrics-cert\") pod \"ovnkube-node-kvv2h\" (UID: 
\"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.745041 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-slash\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.745057 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-log-socket\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.745081 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-openvswitch\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.745118 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-kubelet\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.745163 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-etc-openvswitch\") pod \"ovnkube-node-kvv2h\" (UID: 
\"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.745186 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-run-ovn-kubernetes\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.745208 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-cni-netd\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.745225 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2sdr\" (UniqueName: \"kubernetes.io/projected/71cc276b-f25c-460b-b718-f058cc1d2521-kube-api-access-q2sdr\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.745242 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-var-lib-openvswitch\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.745261 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.745329 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-env-overrides\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.750044 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not 
be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.763126 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.772937 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.791617 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.802064 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.802102 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.802117 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.802142 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.802158 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:31Z","lastTransitionTime":"2026-03-19T10:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.804739 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.813432 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.826324 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.835538 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846019 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-run-netns\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846069 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-systemd\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846102 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-ovnkube-script-lib\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846137 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-systemd-units\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846166 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-ovn\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846199 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-node-log\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846229 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-cni-bin\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846276 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-ovnkube-config\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846324 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/71cc276b-f25c-460b-b718-f058cc1d2521-ovn-node-metrics-cert\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846360 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-slash\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846389 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-log-socket\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846421 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-openvswitch\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846453 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-kubelet\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846504 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-etc-openvswitch\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846533 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-run-ovn-kubernetes\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846564 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q2sdr\" (UniqueName: \"kubernetes.io/projected/71cc276b-f25c-460b-b718-f058cc1d2521-kube-api-access-q2sdr\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846592 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-cni-netd\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846625 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-var-lib-openvswitch\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846659 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846694 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-env-overrides\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846943 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-slash\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.847029 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-kubelet\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.847056 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-openvswitch\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.846992 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-run-ovn-kubernetes\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.847229 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-var-lib-openvswitch\") pod 
\"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.847243 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-node-log\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.847266 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-log-socket\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.847284 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-ovn\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.847260 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-systemd-units\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.847302 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-run-netns\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc 
kubenswrapper[4765]: I0319 10:23:31.848438 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-ovnkube-script-lib\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.847336 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-systemd\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.847355 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-etc-openvswitch\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.847378 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.847384 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-cni-netd\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.848069 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-env-overrides\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.848209 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.848338 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-ovnkube-config\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.847319 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-cni-bin\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.850202 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/71cc276b-f25c-460b-b718-f058cc1d2521-ovn-node-metrics-cert\") pod \"ovnkube-node-kvv2h\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.865732 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2sdr\" (UniqueName: \"kubernetes.io/projected/71cc276b-f25c-460b-b718-f058cc1d2521-kube-api-access-q2sdr\") pod \"ovnkube-node-kvv2h\" (UID: 
\"71cc276b-f25c-460b-b718-f058cc1d2521\") " pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.904355 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.904557 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.904754 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.904993 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.905178 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:31Z","lastTransitionTime":"2026-03-19T10:23:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.941718 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" event={"ID":"a6231872-d9e9-455e-92f1-51acc5985f6a","Type":"ContainerStarted","Data":"dede7bdac3dce69dbe81cd7fbb0691641cd0dbfe19f50c7a2124a9817eeed181"} Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.944941 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:egress-router-binary-copy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c,Command:[/entrypoint/cnibincopy.sh],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/egress-router-cni/bin/,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:true,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ww7wr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fal
lbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-additional-cni-plugins-79fbl_openshift-multus(a6231872-d9e9-455e-92f1-51acc5985f6a): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.946765 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"egress-router-binary-copy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-additional-cni-plugins-79fbl" podUID="a6231872-d9e9-455e-92f1-51acc5985f6a" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.945149 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"f0768cb658f992dde7a1f52ad13ba7be3b1bacba265523919f0a175319dd655e"} Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.948281 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wcdqx" event={"ID":"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965","Type":"ContainerStarted","Data":"56d0aee96061670f637537f3f20ab314bfbc131d1019778ea25ab3da6a812fce"} Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.949359 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:machine-config-daemon,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a,Command:[/usr/bin/machine-config-daemon],Args:[start 
--payload-version=4.18.1],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:health,HostPort:8798,ContainerPort:8798,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:rootfs,ReadOnly:false,MountPath:/rootfs,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fj62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/health,Port:{0 8798 },Host:127.0.0.1,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:120,TimeoutSeconds:1,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" 
logger="UnhandledError" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.950269 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:23:31 crc kubenswrapper[4765]: container &Container{Name:dns-node-resolver,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2,Command:[/bin/bash -c #!/bin/bash Mar 19 10:23:31 crc kubenswrapper[4765]: set -uo pipefail Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: trap 'jobs -p | xargs kill || true; wait; exit 0' TERM Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: OPENSHIFT_MARKER="openshift-generated-node-resolver" Mar 19 10:23:31 crc kubenswrapper[4765]: HOSTS_FILE="/etc/hosts" Mar 19 10:23:31 crc kubenswrapper[4765]: TEMP_FILE="/etc/hosts.tmp" Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: IFS=', ' read -r -a services <<< "${SERVICES}" Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: # Make a temporary file with the old hosts file's attributes. Mar 19 10:23:31 crc kubenswrapper[4765]: if ! cp -f --attributes-only "${HOSTS_FILE}" "${TEMP_FILE}"; then Mar 19 10:23:31 crc kubenswrapper[4765]: echo "Failed to preserve hosts file. Exiting." Mar 19 10:23:31 crc kubenswrapper[4765]: exit 1 Mar 19 10:23:31 crc kubenswrapper[4765]: fi Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: while true; do Mar 19 10:23:31 crc kubenswrapper[4765]: declare -A svc_ips Mar 19 10:23:31 crc kubenswrapper[4765]: for svc in "${services[@]}"; do Mar 19 10:23:31 crc kubenswrapper[4765]: # Fetch service IP from cluster dns if present. We make several tries Mar 19 10:23:31 crc kubenswrapper[4765]: # to do it: IPv4, IPv6, IPv4 over TCP and IPv6 over TCP. 
The two last ones Mar 19 10:23:31 crc kubenswrapper[4765]: # are for deployments with Kuryr on older OpenStack (OSP13) - those do not Mar 19 10:23:31 crc kubenswrapper[4765]: # support UDP loadbalancers and require reaching DNS through TCP. Mar 19 10:23:31 crc kubenswrapper[4765]: cmds=('dig -t A @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 10:23:31 crc kubenswrapper[4765]: 'dig -t AAAA @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 10:23:31 crc kubenswrapper[4765]: 'dig -t A +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"' Mar 19 10:23:31 crc kubenswrapper[4765]: 'dig -t AAAA +tcp +retry=0 @"${NAMESERVER}" +short "${svc}.${CLUSTER_DOMAIN}"|grep -v "^;"') Mar 19 10:23:31 crc kubenswrapper[4765]: for i in ${!cmds[*]} Mar 19 10:23:31 crc kubenswrapper[4765]: do Mar 19 10:23:31 crc kubenswrapper[4765]: ips=($(eval "${cmds[i]}")) Mar 19 10:23:31 crc kubenswrapper[4765]: if [[ "$?" -eq 0 && "${#ips[@]}" -ne 0 ]]; then Mar 19 10:23:31 crc kubenswrapper[4765]: svc_ips["${svc}"]="${ips[@]}" Mar 19 10:23:31 crc kubenswrapper[4765]: break Mar 19 10:23:31 crc kubenswrapper[4765]: fi Mar 19 10:23:31 crc kubenswrapper[4765]: done Mar 19 10:23:31 crc kubenswrapper[4765]: done Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: # Update /etc/hosts only if we get valid service IPs Mar 19 10:23:31 crc kubenswrapper[4765]: # We will not update /etc/hosts when there is coredns service outage or api unavailability Mar 19 10:23:31 crc kubenswrapper[4765]: # Stale entries could exist in /etc/hosts if the service is deleted Mar 19 10:23:31 crc kubenswrapper[4765]: if [[ -n "${svc_ips[*]-}" ]]; then Mar 19 10:23:31 crc kubenswrapper[4765]: # Build a new hosts file from /etc/hosts with our custom entries filtered out Mar 19 10:23:31 crc kubenswrapper[4765]: if ! 
sed --silent "/# ${OPENSHIFT_MARKER}/d; w ${TEMP_FILE}" "${HOSTS_FILE}"; then Mar 19 10:23:31 crc kubenswrapper[4765]: # Only continue rebuilding the hosts entries if its original content is preserved Mar 19 10:23:31 crc kubenswrapper[4765]: sleep 60 & wait Mar 19 10:23:31 crc kubenswrapper[4765]: continue Mar 19 10:23:31 crc kubenswrapper[4765]: fi Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: # Append resolver entries for services Mar 19 10:23:31 crc kubenswrapper[4765]: rc=0 Mar 19 10:23:31 crc kubenswrapper[4765]: for svc in "${!svc_ips[@]}"; do Mar 19 10:23:31 crc kubenswrapper[4765]: for ip in ${svc_ips[${svc}]}; do Mar 19 10:23:31 crc kubenswrapper[4765]: echo "${ip} ${svc} ${svc}.${CLUSTER_DOMAIN} # ${OPENSHIFT_MARKER}" >> "${TEMP_FILE}" || rc=$? Mar 19 10:23:31 crc kubenswrapper[4765]: done Mar 19 10:23:31 crc kubenswrapper[4765]: done Mar 19 10:23:31 crc kubenswrapper[4765]: if [[ $rc -ne 0 ]]; then Mar 19 10:23:31 crc kubenswrapper[4765]: sleep 60 & wait Mar 19 10:23:31 crc kubenswrapper[4765]: continue Mar 19 10:23:31 crc kubenswrapper[4765]: fi Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: Mar 19 10:23:31 crc kubenswrapper[4765]: # TODO: Update /etc/hosts atomically to avoid any inconsistent behavior Mar 19 10:23:31 crc kubenswrapper[4765]: # Replace /etc/hosts with our modified version if needed Mar 19 10:23:31 crc kubenswrapper[4765]: cmp "${TEMP_FILE}" "${HOSTS_FILE}" || cp -f "${TEMP_FILE}" "${HOSTS_FILE}" Mar 19 10:23:31 crc kubenswrapper[4765]: # TEMP_FILE is not removed to avoid file create/delete and attributes copy churn Mar 19 10:23:31 crc kubenswrapper[4765]: fi Mar 19 10:23:31 crc kubenswrapper[4765]: sleep 60 & wait Mar 19 10:23:31 crc kubenswrapper[4765]: unset svc_ips Mar 19 10:23:31 crc kubenswrapper[4765]: done Mar 19 10:23:31 crc kubenswrapper[4765]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:SERVICES,Value:image-registry.openshift-image-registry.svc,ValueFrom:nil,},EnvVar{Name:NAMESERVER,Value:10.217.4.10,ValueFrom:nil,},EnvVar{Name:CLUSTER_DOMAIN,Value:cluster.local,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{22020096 0} {} 21Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:hosts-file,ReadOnly:false,MountPath:/etc/hosts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4sdtb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod node-resolver-wcdqx_openshift-dns(c2b2ebd2-48e2-431e-a91d-faa3fc4f3965): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 10:23:31 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.950930 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mmrh7" event={"ID":"d9d027fd-4e70-4daf-9dd2-adefcc2a868f","Type":"ContainerStarted","Data":"a0be712b18d15509ac9f6c199a955cc18a28362c82933369fb7097fde16d9c02"} Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 
10:23:31.951401 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"dns-node-resolver\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-dns/node-resolver-wcdqx" podUID="c2b2ebd2-48e2-431e-a91d-faa3fc4f3965" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.951638 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:kube-rbac-proxy,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09,Command:[],Args:[--secure-listen-address=0.0.0.0:9001 --config-file=/etc/kube-rbac-proxy/config-file.yaml --tls-cipher-suites=TLS_AES_128_GCM_SHA256,TLS_AES_256_GCM_SHA384,TLS_CHACHA20_POLY1305_SHA256,TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256,TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384,TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256,TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 --tls-min-version=VersionTLS12 --upstream=http://127.0.0.1:8797 --logtostderr=true --tls-cert-file=/etc/tls/private/tls.crt --tls-private-key-file=/etc/tls/private/tls.key],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:9001,ContainerPort:9001,Protocol:TCP,HostIP:,},},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{20 -3} {} 20m DecimalSI},memory: {{52428800 0} {} 50Mi 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:proxy-tls,ReadOnly:false,MountPath:/etc/tls/private,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:mcd-auth-proxy-config,ReadOnly:false,MountPath:/etc/kube-rbac-proxy,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2fj62,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars" logger="UnhandledError" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.952288 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:23:31 crc kubenswrapper[4765]: container &Container{Name:kube-multus,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26,Command:[/bin/bash -ec --],Args:[MULTUS_DAEMON_OPT="" Mar 19 10:23:31 crc kubenswrapper[4765]: /entrypoint/cnibincopy.sh; exec /usr/src/multus-cni/bin/multus-daemon $MULTUS_DAEMON_OPT Mar 19 10:23:31 crc kubenswrapper[4765]: 
],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:RHEL8_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel8/bin/,ValueFrom:nil,},EnvVar{Name:RHEL9_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/rhel9/bin/,ValueFrom:nil,},EnvVar{Name:DEFAULT_SOURCE_DIRECTORY,Value:/usr/src/multus-cni/bin/,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:6443,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:api-int.crc.testing,ValueFrom:nil,},EnvVar{Name:MULTUS_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:K8S_NODE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{68157440 0} {} 65Mi BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:cni-binary-copy,ReadOnly:false,MountPath:/entrypoint,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:os-release,ReadOnly:false,MountPath:/host/etc/os-release,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:system-cni-dir,ReadOnly:false,MountPath:/host/etc/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-cni-dir,ReadOnly:false,MountPath:/host/run/multus/cni/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:cnibin,ReadOnly:false,MountPath:/host/opt/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-socket-dir-parent,ReadOnly:false,MountPath:/host/run/multus,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-k8s-cni-cncf-io,ReadOnly:false,MountPath:/run/k8s.cni.cncf.io,SubPath:
,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-netns,ReadOnly:false,MountPath:/run/netns,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-bin,ReadOnly:false,MountPath:/var/lib/cni/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-cni-multus,ReadOnly:false,MountPath:/var/lib/cni/multus,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-var-lib-kubelet,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:hostroot,ReadOnly:false,MountPath:/hostroot,SubPath:,MountPropagation:*HostToContainer,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-conf-dir,ReadOnly:false,MountPath:/etc/cni/multus/net.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:multus-daemon-config,ReadOnly:true,MountPath:/etc/cni/net.d/multus.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:host-run-multus-certs,ReadOnly:false,MountPath:/etc/cni/multus/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:etc-kubernetes,ReadOnly:false,MountPath:/etc/kubernetes,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-98wwh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:*true,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:fal
se,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod multus-mmrh7_openshift-multus(d9d027fd-4e70-4daf-9dd2-adefcc2a868f): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 10:23:31 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.952757 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"machine-config-daemon\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\", failed to \"StartContainer\" for \"kube-rbac-proxy\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"]" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:23:31 crc kubenswrapper[4765]: E0319 10:23:31.953836 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-multus/multus-mmrh7" podUID="d9d027fd-4e70-4daf-9dd2-adefcc2a868f" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.957353 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.967400 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.977081 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.987337 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],
\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:31 crc kubenswrapper[4765]: I0319 10:23:31.999416 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.005881 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.007622 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.007656 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.007667 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.007688 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.007701 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:32Z","lastTransitionTime":"2026-03-19T10:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.012159 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: W0319 10:23:32.019597 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71cc276b_f25c_460b_b718_f058cc1d2521.slice/crio-7066a410bb56ff63006cf9b29ff7e8650033cc93539512bd69fa128f48deca6d WatchSource:0}: Error finding container 7066a410bb56ff63006cf9b29ff7e8650033cc93539512bd69fa128f48deca6d: Status 404 returned error can't find the container with id 7066a410bb56ff63006cf9b29ff7e8650033cc93539512bd69fa128f48deca6d Mar 19 10:23:32 crc kubenswrapper[4765]: E0319 10:23:32.022544 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:23:32 crc kubenswrapper[4765]: init container 
&Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 19 10:23:32 crc kubenswrapper[4765]: apiVersion: v1 Mar 19 10:23:32 crc kubenswrapper[4765]: clusters: Mar 19 10:23:32 crc kubenswrapper[4765]: - cluster: Mar 19 10:23:32 crc kubenswrapper[4765]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 19 10:23:32 crc kubenswrapper[4765]: server: https://api-int.crc.testing:6443 Mar 19 10:23:32 crc kubenswrapper[4765]: name: default-cluster Mar 19 10:23:32 crc kubenswrapper[4765]: contexts: Mar 19 10:23:32 crc kubenswrapper[4765]: - context: Mar 19 10:23:32 crc kubenswrapper[4765]: cluster: default-cluster Mar 19 10:23:32 crc kubenswrapper[4765]: namespace: default Mar 19 10:23:32 crc kubenswrapper[4765]: user: default-auth Mar 19 10:23:32 crc kubenswrapper[4765]: name: default-context Mar 19 10:23:32 crc kubenswrapper[4765]: current-context: default-context Mar 19 10:23:32 crc kubenswrapper[4765]: kind: Config Mar 19 10:23:32 crc kubenswrapper[4765]: preferences: {} Mar 19 10:23:32 crc kubenswrapper[4765]: users: Mar 19 10:23:32 crc kubenswrapper[4765]: - name: default-auth Mar 19 10:23:32 crc kubenswrapper[4765]: user: Mar 19 10:23:32 crc kubenswrapper[4765]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 19 10:23:32 crc kubenswrapper[4765]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 19 10:23:32 crc kubenswrapper[4765]: EOF Mar 19 10:23:32 crc kubenswrapper[4765]: 
],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q2sdr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 10:23:32 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:23:32 crc kubenswrapper[4765]: E0319 10:23:32.023852 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.024070 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.050057 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.064008 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.073411 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.084496 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.097407 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.109714 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.109948 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.110238 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.110484 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.110788 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:32Z","lastTransitionTime":"2026-03-19T10:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.115364 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.132436 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.150139 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"nam
e\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6
e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.161791 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.463459 4765 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.469612 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:32 crc kubenswrapper[4765]: E0319 10:23:32.469876 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.476191 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.476739 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.476760 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.476768 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.476782 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.476791 4765 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:32Z","lastTransitionTime":"2026-03-19T10:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.498775 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.517326 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.528486 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.541199 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.555621 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.571173 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.579028 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.579057 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.579067 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.579082 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.579091 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:32Z","lastTransitionTime":"2026-03-19T10:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.588709 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.603507 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.616582 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.632413 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.642975 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.652249 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.665815 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.676325 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.681314 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.681341 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.681351 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.681368 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.681379 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:32Z","lastTransitionTime":"2026-03-19T10:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.690810 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.707814 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.727881 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"nam
e\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-
dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6
e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.740380 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.762665 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.783805 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.783840 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.783849 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.783864 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.783876 4765 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:32Z","lastTransitionTime":"2026-03-19T10:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.803486 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.843520 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.883328 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.886135 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.886172 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.886182 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.886225 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.886239 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:32Z","lastTransitionTime":"2026-03-19T10:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.953847 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerStarted","Data":"7066a410bb56ff63006cf9b29ff7e8650033cc93539512bd69fa128f48deca6d"} Mar 19 10:23:32 crc kubenswrapper[4765]: E0319 10:23:32.955741 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:23:32 crc kubenswrapper[4765]: init container &Container{Name:kubecfg-setup,Image:quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2,Command:[/bin/bash -c cat << EOF > /etc/ovn/kubeconfig Mar 19 10:23:32 crc kubenswrapper[4765]: apiVersion: v1 Mar 19 10:23:32 crc kubenswrapper[4765]: clusters: Mar 19 10:23:32 crc kubenswrapper[4765]: - cluster: Mar 19 10:23:32 crc kubenswrapper[4765]: certificate-authority: /var/run/secrets/kubernetes.io/serviceaccount/ca.crt Mar 19 10:23:32 crc kubenswrapper[4765]: server: https://api-int.crc.testing:6443 Mar 19 10:23:32 crc kubenswrapper[4765]: name: default-cluster Mar 19 10:23:32 crc kubenswrapper[4765]: contexts: Mar 19 10:23:32 crc kubenswrapper[4765]: - context: Mar 19 10:23:32 crc kubenswrapper[4765]: cluster: default-cluster Mar 19 10:23:32 crc kubenswrapper[4765]: namespace: default Mar 19 10:23:32 crc kubenswrapper[4765]: user: default-auth Mar 19 10:23:32 crc kubenswrapper[4765]: name: default-context Mar 19 10:23:32 crc kubenswrapper[4765]: current-context: default-context Mar 19 10:23:32 crc kubenswrapper[4765]: kind: Config Mar 19 10:23:32 crc kubenswrapper[4765]: preferences: {} Mar 19 10:23:32 crc kubenswrapper[4765]: users: Mar 19 10:23:32 crc kubenswrapper[4765]: - name: default-auth Mar 19 10:23:32 crc kubenswrapper[4765]: user: Mar 19 10:23:32 crc kubenswrapper[4765]: client-certificate: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 19 10:23:32 crc 
kubenswrapper[4765]: client-key: /etc/ovn/ovnkube-node-certs/ovnkube-client-current.pem Mar 19 10:23:32 crc kubenswrapper[4765]: EOF Mar 19 10:23:32 crc kubenswrapper[4765]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-openvswitch,ReadOnly:false,MountPath:/etc/ovn/,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-q2sdr,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521): CreateContainerConfigError: services have not yet been read at least once, cannot construct envvars Mar 19 10:23:32 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:23:32 crc kubenswrapper[4765]: E0319 10:23:32.957116 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kubecfg-setup\" with CreateContainerConfigError: \"services have not yet been read at least once, cannot construct envvars\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.966181 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.974029 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.988626 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.988687 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.988702 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.988720 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:32 crc kubenswrapper[4765]: I0319 10:23:32.988734 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:32Z","lastTransitionTime":"2026-03-19T10:23:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.003339 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.044342 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.087551 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.091121 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.091180 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.091205 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.091237 4765 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeNotReady" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.091260 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:33Z","lastTransitionTime":"2026-03-19T10:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.124866 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.163399 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.193862 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.193906 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.193939 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.193997 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.194012 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:33Z","lastTransitionTime":"2026-03-19T10:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.210681 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.245753 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.283438 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.297066 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.297106 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.297117 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.297137 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.297150 4765 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:33Z","lastTransitionTime":"2026-03-19T10:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.324009 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.355996 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.355996 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:33 crc kubenswrapper[4765]: E0319 10:23:33.356172 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:33 crc kubenswrapper[4765]: E0319 10:23:33.356342 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.363529 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\
\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.399382 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.399434 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.399449 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.399469 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.399484 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:33Z","lastTransitionTime":"2026-03-19T10:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.411434 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.501894 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.501980 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.501990 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.502010 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.502019 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:33Z","lastTransitionTime":"2026-03-19T10:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.605022 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.605124 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.605151 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.605182 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.605207 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:33Z","lastTransitionTime":"2026-03-19T10:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.708508 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.708594 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.708619 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.708647 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.708665 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:33Z","lastTransitionTime":"2026-03-19T10:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.811555 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.811606 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.811615 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.811636 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.811648 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:33Z","lastTransitionTime":"2026-03-19T10:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.914688 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.914735 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.914744 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.914768 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:33 crc kubenswrapper[4765]: I0319 10:23:33.914779 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:33Z","lastTransitionTime":"2026-03-19T10:23:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.017379 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.017450 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.017469 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.017496 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.017514 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:34Z","lastTransitionTime":"2026-03-19T10:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.120648 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.120690 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.120701 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.120717 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.120727 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:34Z","lastTransitionTime":"2026-03-19T10:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.222375 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.222444 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.222452 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.222466 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.222476 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:34Z","lastTransitionTime":"2026-03-19T10:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.324866 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.324901 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.324909 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.324924 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.324933 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:34Z","lastTransitionTime":"2026-03-19T10:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.356264 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:34 crc kubenswrapper[4765]: E0319 10:23:34.356412 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.426752 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.426800 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.426812 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.426831 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.426843 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:34Z","lastTransitionTime":"2026-03-19T10:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.529469 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.529715 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.529723 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.529745 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.529756 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:34Z","lastTransitionTime":"2026-03-19T10:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.633093 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.633154 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.633169 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.633192 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.633209 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:34Z","lastTransitionTime":"2026-03-19T10:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.735613 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.735694 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.735728 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.735759 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.735790 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:34Z","lastTransitionTime":"2026-03-19T10:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.838579 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.838617 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.838631 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.838650 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.838663 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:34Z","lastTransitionTime":"2026-03-19T10:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.942705 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.942766 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.942784 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.942810 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:34 crc kubenswrapper[4765]: I0319 10:23:34.942828 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:34Z","lastTransitionTime":"2026-03-19T10:23:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.046396 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.046436 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.046445 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.046464 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.046473 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:35Z","lastTransitionTime":"2026-03-19T10:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.149871 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.149948 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.149999 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.150029 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.150048 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:35Z","lastTransitionTime":"2026-03-19T10:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.254143 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.254227 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.254249 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.254283 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.254309 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:35Z","lastTransitionTime":"2026-03-19T10:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.355856 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.355925 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:35 crc kubenswrapper[4765]: E0319 10:23:35.356153 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:35 crc kubenswrapper[4765]: E0319 10:23:35.356312 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.358488 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.358548 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.358565 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.358588 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.358607 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:35Z","lastTransitionTime":"2026-03-19T10:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.461376 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.461458 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.461475 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.461530 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.461548 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:35Z","lastTransitionTime":"2026-03-19T10:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.564515 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.564586 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.564604 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.564635 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.564653 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:35Z","lastTransitionTime":"2026-03-19T10:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.668669 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.668743 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.668760 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.668790 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.668809 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:35Z","lastTransitionTime":"2026-03-19T10:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.776904 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.777543 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.777561 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.777580 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.777592 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:35Z","lastTransitionTime":"2026-03-19T10:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.881441 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.881511 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.881529 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.881558 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.881577 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:35Z","lastTransitionTime":"2026-03-19T10:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.984423 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.984525 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.984546 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.984572 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:35 crc kubenswrapper[4765]: I0319 10:23:35.984596 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:35Z","lastTransitionTime":"2026-03-19T10:23:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.088256 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.088858 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.088987 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.089105 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.089195 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:36Z","lastTransitionTime":"2026-03-19T10:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.191789 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.191859 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.191876 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.191943 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.191993 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:36Z","lastTransitionTime":"2026-03-19T10:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.294414 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.294806 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.295234 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.295574 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.295889 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:36Z","lastTransitionTime":"2026-03-19T10:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.356244 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:36 crc kubenswrapper[4765]: E0319 10:23:36.356419 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.398997 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.399082 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.399101 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.399130 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.399150 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:36Z","lastTransitionTime":"2026-03-19T10:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.501973 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.502026 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.502038 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.502060 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.502073 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:36Z","lastTransitionTime":"2026-03-19T10:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.605419 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.605828 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.606255 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.606601 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.607003 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:36Z","lastTransitionTime":"2026-03-19T10:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.709582 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.709617 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.709625 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.709640 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.709649 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:36Z","lastTransitionTime":"2026-03-19T10:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.813336 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.813384 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.813394 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.813414 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.813425 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:36Z","lastTransitionTime":"2026-03-19T10:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.916165 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.916217 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.916232 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.916254 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:36 crc kubenswrapper[4765]: I0319 10:23:36.916271 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:36Z","lastTransitionTime":"2026-03-19T10:23:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.019323 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.019394 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.019416 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.019446 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.019465 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:37Z","lastTransitionTime":"2026-03-19T10:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.122152 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.122197 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.122208 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.122227 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.122243 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:37Z","lastTransitionTime":"2026-03-19T10:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.172762 4765 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.225121 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.225183 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.225202 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.225226 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.225241 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:37Z","lastTransitionTime":"2026-03-19T10:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.328276 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.328383 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.328406 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.328439 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.328459 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:37Z","lastTransitionTime":"2026-03-19T10:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.355654 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.355688 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:37 crc kubenswrapper[4765]: E0319 10:23:37.355862 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:37 crc kubenswrapper[4765]: E0319 10:23:37.356158 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.431157 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.431235 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.431253 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.431294 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.431325 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:37Z","lastTransitionTime":"2026-03-19T10:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.530508 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-dntfk"] Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.530894 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dntfk" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.533498 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.533526 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.533560 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.533586 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.533615 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.533636 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:37Z","lastTransitionTime":"2026-03-19T10:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.534313 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.535679 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.537330 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.546767 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could 
not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.562323 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.581673 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.593991 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.605644 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.619853 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.621153 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/144a13fc-5921-4106-8a80-210689777cd4-host\") pod \"node-ca-dntfk\" (UID: \"144a13fc-5921-4106-8a80-210689777cd4\") " pod="openshift-image-registry/node-ca-dntfk" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.621207 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6pl4k\" (UniqueName: \"kubernetes.io/projected/144a13fc-5921-4106-8a80-210689777cd4-kube-api-access-6pl4k\") pod \"node-ca-dntfk\" (UID: \"144a13fc-5921-4106-8a80-210689777cd4\") " pod="openshift-image-registry/node-ca-dntfk" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.621254 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/144a13fc-5921-4106-8a80-210689777cd4-serviceca\") pod \"node-ca-dntfk\" (UID: \"144a13fc-5921-4106-8a80-210689777cd4\") " pod="openshift-image-registry/node-ca-dntfk" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.637818 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.637885 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.637900 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.637925 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.637939 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:37Z","lastTransitionTime":"2026-03-19T10:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.639669 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.655730 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.677478 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.689607 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.720044 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.722596 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6pl4k\" (UniqueName: \"kubernetes.io/projected/144a13fc-5921-4106-8a80-210689777cd4-kube-api-access-6pl4k\") pod \"node-ca-dntfk\" (UID: \"144a13fc-5921-4106-8a80-210689777cd4\") " pod="openshift-image-registry/node-ca-dntfk" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.722698 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/144a13fc-5921-4106-8a80-210689777cd4-serviceca\") pod \"node-ca-dntfk\" (UID: \"144a13fc-5921-4106-8a80-210689777cd4\") " pod="openshift-image-registry/node-ca-dntfk" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.722758 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/144a13fc-5921-4106-8a80-210689777cd4-host\") pod \"node-ca-dntfk\" (UID: \"144a13fc-5921-4106-8a80-210689777cd4\") " pod="openshift-image-registry/node-ca-dntfk" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.722851 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/144a13fc-5921-4106-8a80-210689777cd4-host\") pod \"node-ca-dntfk\" (UID: \"144a13fc-5921-4106-8a80-210689777cd4\") " pod="openshift-image-registry/node-ca-dntfk" Mar 19 10:23:37 
crc kubenswrapper[4765]: I0319 10:23:37.723861 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/144a13fc-5921-4106-8a80-210689777cd4-serviceca\") pod \"node-ca-dntfk\" (UID: \"144a13fc-5921-4106-8a80-210689777cd4\") " pod="openshift-image-registry/node-ca-dntfk" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.732776 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.740276 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.740318 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.740329 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.740348 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.740360 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:37Z","lastTransitionTime":"2026-03-19T10:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.740774 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6pl4k\" (UniqueName: \"kubernetes.io/projected/144a13fc-5921-4106-8a80-210689777cd4-kube-api-access-6pl4k\") pod \"node-ca-dntfk\" (UID: \"144a13fc-5921-4106-8a80-210689777cd4\") " pod="openshift-image-registry/node-ca-dntfk" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.745437 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.754309 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.843837 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.843923 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.843947 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.844395 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.844661 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:37Z","lastTransitionTime":"2026-03-19T10:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.849183 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-dntfk" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.947684 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.947725 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.947736 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.947753 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.947764 4765 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:37Z","lastTransitionTime":"2026-03-19T10:23:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:37 crc kubenswrapper[4765]: I0319 10:23:37.974228 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dntfk" event={"ID":"144a13fc-5921-4106-8a80-210689777cd4","Type":"ContainerStarted","Data":"0d826de7006ec90ca34e0910f3af9613c61e96499c03da6f0fc03c04c69c46c0"} Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.050314 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.050382 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.050400 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.050431 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.050453 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:38Z","lastTransitionTime":"2026-03-19T10:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.153428 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.153463 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.153472 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.153487 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.153498 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:38Z","lastTransitionTime":"2026-03-19T10:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.255742 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.255781 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.255792 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.255810 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.255822 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:38Z","lastTransitionTime":"2026-03-19T10:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.355571 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:38 crc kubenswrapper[4765]: E0319 10:23:38.355769 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.358780 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.358833 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.358846 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.358868 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.358880 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:38Z","lastTransitionTime":"2026-03-19T10:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.461450 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.461487 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.461503 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.461520 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.461531 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:38Z","lastTransitionTime":"2026-03-19T10:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.564034 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.564071 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.564083 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.564102 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.564115 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:38Z","lastTransitionTime":"2026-03-19T10:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.667312 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.667398 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.667418 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.667446 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.667467 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:38Z","lastTransitionTime":"2026-03-19T10:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.770883 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.771002 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.771027 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.771064 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.771086 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:38Z","lastTransitionTime":"2026-03-19T10:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.874752 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.874823 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.874841 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.874873 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.874890 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:38Z","lastTransitionTime":"2026-03-19T10:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.977879 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.977939 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.977990 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.978018 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.978038 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:38Z","lastTransitionTime":"2026-03-19T10:23:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.979863 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-dntfk" event={"ID":"144a13fc-5921-4106-8a80-210689777cd4","Type":"ContainerStarted","Data":"db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394"} Mar 19 10:23:38 crc kubenswrapper[4765]: I0319 10:23:38.996595 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.005322 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.014859 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.030737 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.040730 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.054493 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.066668 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.081101 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.081142 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.081153 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.081175 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.081186 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:39Z","lastTransitionTime":"2026-03-19T10:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.081263 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.101158 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.127136 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.139829 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-
19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.167529 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.184034 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.184120 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.184139 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.184167 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.184191 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:39Z","lastTransitionTime":"2026-03-19T10:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.193317 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.207427 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.289200 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.289273 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.289293 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.289324 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.289343 4765 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:39Z","lastTransitionTime":"2026-03-19T10:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.356182 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.356258 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:39 crc kubenswrapper[4765]: E0319 10:23:39.356400 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:39 crc kubenswrapper[4765]: E0319 10:23:39.357284 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.394046 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.394130 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.394158 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.394195 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.394225 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:39Z","lastTransitionTime":"2026-03-19T10:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.497755 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.497821 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.497843 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.497872 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.497892 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:39Z","lastTransitionTime":"2026-03-19T10:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.602344 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.602504 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.602528 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.602562 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.602592 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:39Z","lastTransitionTime":"2026-03-19T10:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.705192 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.705612 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.705634 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.705714 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.705735 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:39Z","lastTransitionTime":"2026-03-19T10:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.807890 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.807934 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.807952 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.807999 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.808012 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:39Z","lastTransitionTime":"2026-03-19T10:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.910529 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.910566 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.910578 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.910599 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.910613 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:39Z","lastTransitionTime":"2026-03-19T10:23:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.985426 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8"} Mar 19 10:23:39 crc kubenswrapper[4765]: I0319 10:23:39.985501 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61"} Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.000103 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.013496 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.013549 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:40 crc 
kubenswrapper[4765]: I0319 10:23:40.013560 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.013579 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.013902 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:40Z","lastTransitionTime":"2026-03-19T10:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.015074 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.033530 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.062248 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.078096 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.078137 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.078149 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.078167 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.078180 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:40Z","lastTransitionTime":"2026-03-19T10:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.086540 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: E0319 10:23:40.101361 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.109198 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.109257 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.109275 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.109303 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.109323 4765 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:40Z","lastTransitionTime":"2026-03-19T10:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.126230 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":
\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/sta
tic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\
",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: E0319 10:23:40.130387 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.135693 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.135767 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.135782 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.135801 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.135814 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:40Z","lastTransitionTime":"2026-03-19T10:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:40 crc kubenswrapper[4765]: E0319 10:23:40.155334 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.158412 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"n
ame\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.159668 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.159743 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.159759 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.159785 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.159808 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:40Z","lastTransitionTime":"2026-03-19T10:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.175830 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: E0319 10:23:40.176707 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.181334 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.181384 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.181396 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.181423 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.181441 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:40Z","lastTransitionTime":"2026-03-19T10:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.194866 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: E0319 10:23:40.198567 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:40Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: E0319 10:23:40.198753 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.200803 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.200835 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.200846 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.200862 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.200873 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:40Z","lastTransitionTime":"2026-03-19T10:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.209397 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.223597 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.242684 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.256328 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.267539 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:40Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.303377 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.303544 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.303648 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.303762 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.303845 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:40Z","lastTransitionTime":"2026-03-19T10:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.356288 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:40 crc kubenswrapper[4765]: E0319 10:23:40.356454 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.407366 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.407869 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.407883 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.407909 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.407922 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:40Z","lastTransitionTime":"2026-03-19T10:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.510076 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.510133 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.510150 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.510175 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.510188 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:40Z","lastTransitionTime":"2026-03-19T10:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.612888 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.613177 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.613280 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.613366 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.613444 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:40Z","lastTransitionTime":"2026-03-19T10:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.715544 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.715572 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.715580 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.715596 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.715606 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:40Z","lastTransitionTime":"2026-03-19T10:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.817474 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.817519 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.817531 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.817549 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.817561 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:40Z","lastTransitionTime":"2026-03-19T10:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.919644 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.919673 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.919681 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.919695 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:40 crc kubenswrapper[4765]: I0319 10:23:40.919703 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:40Z","lastTransitionTime":"2026-03-19T10:23:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.022254 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.022324 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.022340 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.022359 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.022371 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:41Z","lastTransitionTime":"2026-03-19T10:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.124383 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.124423 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.124431 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.124447 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.124456 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:41Z","lastTransitionTime":"2026-03-19T10:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.227568 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.227654 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.227679 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.227710 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.227746 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:41Z","lastTransitionTime":"2026-03-19T10:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.329749 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.329801 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.329809 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.329827 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.329837 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:41Z","lastTransitionTime":"2026-03-19T10:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.355312 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.355354 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:41 crc kubenswrapper[4765]: E0319 10:23:41.355520 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:41 crc kubenswrapper[4765]: E0319 10:23:41.355681 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.433074 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.433126 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.433142 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.433165 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.433184 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:41Z","lastTransitionTime":"2026-03-19T10:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.536100 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.536143 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.536152 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.536171 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.536182 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:41Z","lastTransitionTime":"2026-03-19T10:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.639206 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.639262 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.639279 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.639325 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.639343 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:41Z","lastTransitionTime":"2026-03-19T10:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.743172 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.743265 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.743284 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.743311 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.743330 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:41Z","lastTransitionTime":"2026-03-19T10:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.846209 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.846253 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.846263 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.846284 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.846297 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:41Z","lastTransitionTime":"2026-03-19T10:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.949034 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.949080 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.949093 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.949110 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:41 crc kubenswrapper[4765]: I0319 10:23:41.949124 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:41Z","lastTransitionTime":"2026-03-19T10:23:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.051361 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.051402 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.051413 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.051430 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.051441 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:42Z","lastTransitionTime":"2026-03-19T10:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.153389 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.153426 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.153434 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.153450 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.153463 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:42Z","lastTransitionTime":"2026-03-19T10:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.256148 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.256196 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.256207 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.256227 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.256254 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:42Z","lastTransitionTime":"2026-03-19T10:23:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.356354 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:42 crc kubenswrapper[4765]: E0319 10:23:42.356587 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:42 crc kubenswrapper[4765]: E0319 10:23:42.357057 4765 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.358331 4765 scope.go:117] "RemoveContainer" containerID="678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.373117 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.387331 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.403568 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 
10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev
/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.415566 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.428898 4765 status_manager.go:875] "Failed 
to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.442435 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.462237 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.473655 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:42 crc kubenswrapper[4765]: E0319 10:23:42.473801 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.496225 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\
\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee
1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.509005 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.521600 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.534243 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.546897 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.560802 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.995030 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.996850 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127"} Mar 19 10:23:42 crc kubenswrapper[4765]: I0319 10:23:42.997227 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.009393 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.023820 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.043670 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.064012 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.085519 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.108563 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.121666 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.142075 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.154812 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.168639 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.181245 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.196489 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.208760 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.224983 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.355389 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.355451 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:43 crc kubenswrapper[4765]: E0319 10:23:43.355583 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:43 crc kubenswrapper[4765]: E0319 10:23:43.355757 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.373077 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9"] Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.373817 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.378290 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.378404 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.392047 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.405876 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.421506 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.434908 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.447934 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.462591 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.487190 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\
"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13
a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b9
0092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"t
erminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.487490 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nf4m5\" (UniqueName: \"kubernetes.io/projected/e0a1287f-75c3-4e89-899e-d0cdd6575f9c-kube-api-access-nf4m5\") pod \"ovnkube-control-plane-749d76644c-kv7q9\" (UID: \"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.487530 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0a1287f-75c3-4e89-899e-d0cdd6575f9c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kv7q9\" (UID: \"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" Mar 19 10:23:43 
crc kubenswrapper[4765]: I0319 10:23:43.487569 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0a1287f-75c3-4e89-899e-d0cdd6575f9c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kv7q9\" (UID: \"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.487593 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0a1287f-75c3-4e89-899e-d0cdd6575f9c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kv7q9\" (UID: \"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.504173 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.522845 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.536607 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.552507 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.577192 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.588618 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nf4m5\" (UniqueName: \"kubernetes.io/projected/e0a1287f-75c3-4e89-899e-d0cdd6575f9c-kube-api-access-nf4m5\") pod \"ovnkube-control-plane-749d76644c-kv7q9\" (UID: \"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.588659 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0a1287f-75c3-4e89-899e-d0cdd6575f9c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kv7q9\" (UID: \"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.588680 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0a1287f-75c3-4e89-899e-d0cdd6575f9c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kv7q9\" (UID: \"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.588699 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0a1287f-75c3-4e89-899e-d0cdd6575f9c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kv7q9\" (UID: 
\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.592400 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/e0a1287f-75c3-4e89-899e-d0cdd6575f9c-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kv7q9\" (UID: \"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.592664 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/e0a1287f-75c3-4e89-899e-d0cdd6575f9c-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kv7q9\" (UID: \"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.594325 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.597023 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/e0a1287f-75c3-4e89-899e-d0cdd6575f9c-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kv7q9\" (UID: \"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.606556 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.610021 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nf4m5\" (UniqueName: \"kubernetes.io/projected/e0a1287f-75c3-4e89-899e-d0cdd6575f9c-kube-api-access-nf4m5\") pod \"ovnkube-control-plane-749d76644c-kv7q9\" (UID: \"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.620195 4765 status_manager.go:875] "Failed to update status for 
pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:43 crc kubenswrapper[4765]: I0319 10:23:43.728141 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" Mar 19 10:23:43 crc kubenswrapper[4765]: W0319 10:23:43.739829 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0a1287f_75c3_4e89_899e_d0cdd6575f9c.slice/crio-dbdff5e71bfdfafacae9ee528ef8b05838500de121a3ef2562793d2357578111 WatchSource:0}: Error finding container dbdff5e71bfdfafacae9ee528ef8b05838500de121a3ef2562793d2357578111: Status 404 returned error can't find the container with id dbdff5e71bfdfafacae9ee528ef8b05838500de121a3ef2562793d2357578111 Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.002881 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" event={"ID":"e0a1287f-75c3-4e89-899e-d0cdd6575f9c","Type":"ContainerStarted","Data":"dbdff5e71bfdfafacae9ee528ef8b05838500de121a3ef2562793d2357578111"} Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.004764 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wcdqx" event={"ID":"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965","Type":"ContainerStarted","Data":"74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929"} Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.008087 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4"} Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.010152 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab"} Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 
10:23:44.010207 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"bac2fb12c527f8417dc08378065d3f56b5cc55052743230c4bb2f532ca6be00d"} Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.024135 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/sec
rets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.038434 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.058808 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.070900 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.089482 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.104104 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-t8k4k"] Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.105714 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:44 crc kubenswrapper[4765]: E0319 10:23:44.105822 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.108513 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.120637 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.133118 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.143939 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.154258 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.168463 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.179481 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.190167 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168
.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.195815 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j85hd\" (UniqueName: \"kubernetes.io/projected/ab39cf0a-a301-484b-9328-19acff8edae9-kube-api-access-j85hd\") pod \"network-metrics-daemon-t8k4k\" (UID: \"ab39cf0a-a301-484b-9328-19acff8edae9\") " pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.195864 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs\") pod \"network-metrics-daemon-t8k4k\" (UID: \"ab39cf0a-a301-484b-9328-19acff8edae9\") " pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.202702 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.214781 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.225352 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.239190 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.250899 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.264502 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.278586 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.296744 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs\") pod \"network-metrics-daemon-t8k4k\" (UID: \"ab39cf0a-a301-484b-9328-19acff8edae9\") " pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.296824 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j85hd\" (UniqueName: \"kubernetes.io/projected/ab39cf0a-a301-484b-9328-19acff8edae9-kube-api-access-j85hd\") pod \"network-metrics-daemon-t8k4k\" (UID: \"ab39cf0a-a301-484b-9328-19acff8edae9\") " pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:44 crc kubenswrapper[4765]: E0319 10:23:44.296914 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:23:44 crc kubenswrapper[4765]: E0319 10:23:44.297001 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs podName:ab39cf0a-a301-484b-9328-19acff8edae9 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:44.79698086 +0000 UTC m=+123.145926402 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs") pod "network-metrics-daemon-t8k4k" (UID: "ab39cf0a-a301-484b-9328-19acff8edae9") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.297286 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.306193 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.311536 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j85hd\" (UniqueName: \"kubernetes.io/projected/ab39cf0a-a301-484b-9328-19acff8edae9-kube-api-access-j85hd\") pod \"network-metrics-daemon-t8k4k\" (UID: \"ab39cf0a-a301-484b-9328-19acff8edae9\") " pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.315186 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc 
kubenswrapper[4765]: I0319 10:23:44.340225 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.352666 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.355563 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:44 crc kubenswrapper[4765]: E0319 10:23:44.355764 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.363262 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.363937 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\
":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.374285 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.382839 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.393295 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc55052743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.406577 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.421614 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:44 crc kubenswrapper[4765]: I0319 10:23:44.810435 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs\") pod \"network-metrics-daemon-t8k4k\" (UID: \"ab39cf0a-a301-484b-9328-19acff8edae9\") " pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:44 crc kubenswrapper[4765]: E0319 10:23:44.810799 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:23:44 crc kubenswrapper[4765]: E0319 10:23:44.811036 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs podName:ab39cf0a-a301-484b-9328-19acff8edae9 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:45.811004769 +0000 UTC m=+124.159950331 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs") pod "network-metrics-daemon-t8k4k" (UID: "ab39cf0a-a301-484b-9328-19acff8edae9") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.015833 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" event={"ID":"e0a1287f-75c3-4e89-899e-d0cdd6575f9c","Type":"ContainerStarted","Data":"4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad"} Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.015879 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" event={"ID":"e0a1287f-75c3-4e89-899e-d0cdd6575f9c","Type":"ContainerStarted","Data":"44ee46c5aef7fb65d271287bb29010a30473aed207994ef37787ccfad61cd2b6"} Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.037318 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.053669 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.069657 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.086913 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.111261 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.126613 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.142655 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc 
kubenswrapper[4765]: I0319 10:23:45.170054 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.183844 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.194503 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.205399 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.217223 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.226553 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.250744 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.267005 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.283620 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.300760 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.355490 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.355587 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.355572 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:45 crc kubenswrapper[4765]: E0319 10:23:45.355777 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:45 crc kubenswrapper[4765]: E0319 10:23:45.356018 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:45 crc kubenswrapper[4765]: E0319 10:23:45.356198 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:23:45 crc kubenswrapper[4765]: I0319 10:23:45.822067 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs\") pod \"network-metrics-daemon-t8k4k\" (UID: \"ab39cf0a-a301-484b-9328-19acff8edae9\") " pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:45 crc kubenswrapper[4765]: E0319 10:23:45.822325 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:23:45 crc kubenswrapper[4765]: E0319 10:23:45.822462 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs podName:ab39cf0a-a301-484b-9328-19acff8edae9 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:47.822433934 +0000 UTC m=+126.171379476 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs") pod "network-metrics-daemon-t8k4k" (UID: "ab39cf0a-a301-484b-9328-19acff8edae9") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.025975 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf"} Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.046018 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.i
o/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/et
c/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.068693 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.083487 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.111566 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.135585 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.160336 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.172362 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.190279 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 
10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.203427 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.218384 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f
443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.232770 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.253521 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.266068 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"resta
rtCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.276160 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc 
kubenswrapper[4765]: I0319 10:23:46.293399 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.307326 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c70952
9cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.322344 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:46Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:46 crc kubenswrapper[4765]: I0319 10:23:46.355979 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:46 crc kubenswrapper[4765]: E0319 10:23:46.356323 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.039436 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mmrh7" event={"ID":"d9d027fd-4e70-4daf-9dd2-adefcc2a868f","Type":"ContainerStarted","Data":"10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0"} Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.042307 4765 generic.go:334] "Generic (PLEG): container finished" podID="71cc276b-f25c-460b-b718-f058cc1d2521" containerID="ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7" exitCode=0 Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.042593 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerDied","Data":"ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7"} Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.056429 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.079707 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.097583 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.125748 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for 
pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.137642 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.137809 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.137952 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.137974 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:24:19.137912202 +0000 UTC m=+157.486857744 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.138049 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:24:19.138027365 +0000 UTC m=+157.486972907 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.153186 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.169476 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc 
kubenswrapper[4765]: I0319 10:23:47.189911 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.206428 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.219171 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.234453 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.239761 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.239822 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:47 crc kubenswrapper[4765]: 
I0319 10:23:47.239884 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.240336 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.240383 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.240398 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.240468 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.240519 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.240537 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.240479 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 10:24:19.240458784 +0000 UTC m=+157.589404326 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.240624 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 10:24:19.240598588 +0000 UTC m=+157.589544310 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.240707 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.240746 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:24:19.240737872 +0000 UTC m=+157.589683624 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.264074 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\
"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.280891 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.294188 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.311839 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.325507 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.343054 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.355831 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.356028 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.356093 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.356123 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.356486 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.356554 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.366210 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.386220 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.409733 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f
443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.426155 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.446946 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.459238 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.469151 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc 
kubenswrapper[4765]: E0319 10:23:47.474873 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.487535 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernet
es/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c9
30fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,
\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.510193 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.525879 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.538990 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.556102 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.570280 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.590813 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"services have not yet been read at least once, cannot construct 
envvars\\\",\\\"reason\\\":\\\"CreateContainerConfigError\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.612675 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.627995 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.641105 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.662710 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:47Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:47 crc kubenswrapper[4765]: I0319 10:23:47.848080 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs\") pod \"network-metrics-daemon-t8k4k\" (UID: \"ab39cf0a-a301-484b-9328-19acff8edae9\") " pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.848408 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:23:47 crc kubenswrapper[4765]: E0319 10:23:47.848557 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs podName:ab39cf0a-a301-484b-9328-19acff8edae9 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:51.848511348 +0000 UTC m=+130.197457080 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs") pod "network-metrics-daemon-t8k4k" (UID: "ab39cf0a-a301-484b-9328-19acff8edae9") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.047169 4765 generic.go:334] "Generic (PLEG): container finished" podID="a6231872-d9e9-455e-92f1-51acc5985f6a" containerID="d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a" exitCode=0 Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.047459 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" event={"ID":"a6231872-d9e9-455e-92f1-51acc5985f6a","Type":"ContainerDied","Data":"d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a"} Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.053442 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerStarted","Data":"7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91"} Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.053479 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerStarted","Data":"778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2"} Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.053495 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerStarted","Data":"5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0"} Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.053511 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerStarted","Data":"8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367"} Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.053522 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerStarted","Data":"071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88"} Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.053536 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerStarted","Data":"ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2"} Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.064453 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.088491 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.104194 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.122049 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube
-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.136580 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.150072 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.166401 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.186908 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.200759 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.213010 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.227462 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.240833 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.261283 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.272688 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.283471 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc 
kubenswrapper[4765]: I0319 10:23:48.295614 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.308352 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:48Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:48 crc kubenswrapper[4765]: I0319 10:23:48.356007 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:48 crc kubenswrapper[4765]: E0319 10:23:48.356168 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.059371 4765 generic.go:334] "Generic (PLEG): container finished" podID="a6231872-d9e9-455e-92f1-51acc5985f6a" containerID="a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52" exitCode=0 Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.059420 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" event={"ID":"a6231872-d9e9-455e-92f1-51acc5985f6a","Type":"ContainerDied","Data":"a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52"} Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.081698 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.095249 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.106231 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.117991 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.129039 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.144123 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodIn
itializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-
release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.159431 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.174913 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.192227 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.217421 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.237370 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.252531 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.271850 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.287939 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.309467 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.320310 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.332916 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:49Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:49 crc 
kubenswrapper[4765]: I0319 10:23:49.356275 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.356302 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:49 crc kubenswrapper[4765]: I0319 10:23:49.356300 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:49 crc kubenswrapper[4765]: E0319 10:23:49.356424 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:49 crc kubenswrapper[4765]: E0319 10:23:49.356525 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:49 crc kubenswrapper[4765]: E0319 10:23:49.356596 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.063769 4765 generic.go:334] "Generic (PLEG): container finished" podID="a6231872-d9e9-455e-92f1-51acc5985f6a" containerID="654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144" exitCode=0 Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.063838 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" event={"ID":"a6231872-d9e9-455e-92f1-51acc5985f6a","Type":"ContainerDied","Data":"654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144"} Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.075829 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerStarted","Data":"f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a"} Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.081748 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.097819 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.116543 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 
10:23:50.129600 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-d
ev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\
"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.142172 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.155139 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.170687 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.189234 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.204285 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.217347 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.236189 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.248214 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.262933 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc 
kubenswrapper[4765]: I0319 10:23:50.293934 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.307111 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.307174 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.307191 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.307215 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.307228 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:50Z","lastTransitionTime":"2026-03-19T10:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.308313 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.321864 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: E0319 10:23:50.324500 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status 
\"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.330544 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.330620 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.330637 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.331583 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.331624 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:50Z","lastTransitionTime":"2026-03-19T10:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.333379 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: E0319 10:23:50.346104 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"kubelet has no disk 
pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2
ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9810067
4616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.
io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a07
2c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa73
83b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.350110 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.350152 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.350163 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.350181 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.350191 4765 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:50Z","lastTransitionTime":"2026-03-19T10:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.355240 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:50 crc kubenswrapper[4765]: E0319 10:23:50.355411 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:50 crc kubenswrapper[4765]: E0319 10:23:50.364576 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory 
available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.368184 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.368218 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.368227 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.368243 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.368252 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:50Z","lastTransitionTime":"2026-03-19T10:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:50 crc kubenswrapper[4765]: E0319 10:23:50.381166 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.385074 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.385099 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.385108 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.385125 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:23:50 crc kubenswrapper[4765]: I0319 10:23:50.385138 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:23:50Z","lastTransitionTime":"2026-03-19T10:23:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:23:50 crc kubenswrapper[4765]: E0319 10:23:50.400132 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:50Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:50 crc kubenswrapper[4765]: E0319 10:23:50.400244 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.090510 4765 generic.go:334] "Generic (PLEG): container finished" podID="a6231872-d9e9-455e-92f1-51acc5985f6a" containerID="8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c" exitCode=0 Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.090612 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" event={"ID":"a6231872-d9e9-455e-92f1-51acc5985f6a","Type":"ContainerDied","Data":"8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c"} Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.108383 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.129323 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.152075 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc55052743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.172830 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.191260 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\
\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.205670 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.218743 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.235950 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.251417 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.263517 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.277622 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.303372 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.314937 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.325536 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc 
kubenswrapper[4765]: I0319 10:23:51.347643 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.355743 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.355831 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.355743 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:51 crc kubenswrapper[4765]: E0319 10:23:51.355865 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:51 crc kubenswrapper[4765]: E0319 10:23:51.355948 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:51 crc kubenswrapper[4765]: E0319 10:23:51.356041 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.362792 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.374677 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:51Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:51 crc kubenswrapper[4765]: I0319 10:23:51.894328 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs\") pod \"network-metrics-daemon-t8k4k\" (UID: \"ab39cf0a-a301-484b-9328-19acff8edae9\") " pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:51 crc kubenswrapper[4765]: E0319 10:23:51.894476 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:23:51 crc kubenswrapper[4765]: E0319 10:23:51.894548 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs podName:ab39cf0a-a301-484b-9328-19acff8edae9 nodeName:}" failed. No retries permitted until 2026-03-19 10:23:59.894530875 +0000 UTC m=+138.243476417 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs") pod "network-metrics-daemon-t8k4k" (UID: "ab39cf0a-a301-484b-9328-19acff8edae9") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.098399 4765 generic.go:334] "Generic (PLEG): container finished" podID="a6231872-d9e9-455e-92f1-51acc5985f6a" containerID="b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b" exitCode=0 Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.098447 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" event={"ID":"a6231872-d9e9-455e-92f1-51acc5985f6a","Type":"ContainerDied","Data":"b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b"} Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.119671 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.137061 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.149910 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc55052743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.172660 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.196579 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.212410 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.232366 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.254384 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.269080 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.281121 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.295598 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.316906 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.329537 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.340939 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc 
kubenswrapper[4765]: I0319 10:23:52.356208 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:52 crc kubenswrapper[4765]: E0319 10:23:52.356409 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.362246 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\
\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-r
eadyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"moun
tPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/opens
hift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.378790 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",
\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e277
53fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 
10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc
35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.392460 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.410427 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.421708 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.440875 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\
":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f6
8265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e
49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminate
d\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.455166 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"
restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 
4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.467914 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: E0319 10:23:52.476341 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.484551 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.500112 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.519576 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.532502 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.543293 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.554476 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.566573 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.582257 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.600804 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.616066 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.631942 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:52 crc kubenswrapper[4765]: I0319 10:23:52.646455 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:52Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.106986 4765 generic.go:334] "Generic (PLEG): container finished" podID="a6231872-d9e9-455e-92f1-51acc5985f6a" containerID="c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa" exitCode=0 Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.107016 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" 
event={"ID":"a6231872-d9e9-455e-92f1-51acc5985f6a","Type":"ContainerDied","Data":"c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa"} Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.114186 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerStarted","Data":"bea53c89d5c93c64e2a93e2cd125c76dc1a1a7e0b2de68ec7e298402b56743a6"} Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.114641 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.114690 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.123474 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.138297 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.147092 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.154493 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.170938 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.183741 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.196373 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.208097 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.228425 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.248058 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.262275 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.275387 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.288917 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.306983 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.318561 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-rele
ase-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.330451 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc 
kubenswrapper[4765]: I0319 10:23:53.344185 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.355985 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.356142 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:53 crc kubenswrapper[4765]: E0319 10:23:53.356288 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.356376 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.356735 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:53 crc kubenswrapper[4765]: E0319 10:23:53.356876 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:53 crc kubenswrapper[4765]: E0319 10:23:53.358260 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.368832 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state
\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.380263 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.391471 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.410437 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.422834 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.432891 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.444148 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.457122 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.475879 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea53c89d5c93c64e2a93e2cd125c76dc1a1a7e0b2de68ec7e298402b56743a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.485701 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.496697 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc 
kubenswrapper[4765]: I0319 10:23:53.507542 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.517609 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.531366 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.544069 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.557417 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:53 crc kubenswrapper[4765]: I0319 10:23:53.572745 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:53Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.122694 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" event={"ID":"a6231872-d9e9-455e-92f1-51acc5985f6a","Type":"ContainerStarted","Data":"51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec"} Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.123159 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.143057 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.150746 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.162175 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.177067 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.200707 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.225502 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.245824 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.261739 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.277691 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.297107 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea53c89d5c93c64e2a93e2cd125c76dc1a1a7e0b2de68ec7e298402b56743a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.309067 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.323768 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc 
kubenswrapper[4765]: I0319 10:23:54.347744 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.355382 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:54 crc kubenswrapper[4765]: E0319 10:23:54.355611 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.370110 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets
/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.384788 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.
io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"st
arted\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.397280 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.410179 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.428220 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b
24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.440475 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"ru
nning\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473aed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current 
time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.453131 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for 
pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.469301 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.483721 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.497098 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.515155 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea53c89d5c93c64e2a93e2cd125c76dc1a1a7e0b2de68ec7e298402b56743a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.525656 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@
sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.535586 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc 
kubenswrapper[4765]: I0319 10:23:54.556637 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.570908 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.582244 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.594753 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.607538 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.620110 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc55052743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.635691 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\
\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.647434 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/ope
nshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kub
ernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:54 crc kubenswrapper[4765]: I0319 10:23:54.662417 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:54Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:55 crc kubenswrapper[4765]: I0319 10:23:55.355661 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:55 crc kubenswrapper[4765]: I0319 10:23:55.355716 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:55 crc kubenswrapper[4765]: I0319 10:23:55.355871 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:55 crc kubenswrapper[4765]: E0319 10:23:55.356931 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:23:55 crc kubenswrapper[4765]: E0319 10:23:55.357102 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:55 crc kubenswrapper[4765]: E0319 10:23:55.357044 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.131492 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/0.log" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.134177 4765 generic.go:334] "Generic (PLEG): container finished" podID="71cc276b-f25c-460b-b718-f058cc1d2521" containerID="bea53c89d5c93c64e2a93e2cd125c76dc1a1a7e0b2de68ec7e298402b56743a6" exitCode=1 Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.134227 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerDied","Data":"bea53c89d5c93c64e2a93e2cd125c76dc1a1a7e0b2de68ec7e298402b56743a6"} Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.134776 4765 scope.go:117] "RemoveContainer" 
containerID="bea53c89d5c93c64e2a93e2cd125c76dc1a1a7e0b2de68ec7e298402b56743a6" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.159353 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":
\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.177181 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.190545 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.213453 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b
24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.227807 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.242404 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.258082 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.270944 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.286137 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc 
kubenswrapper[4765]: I0319 10:23:56.303638 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.317881 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.328810 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.343056 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.355765 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:56 crc kubenswrapper[4765]: E0319 10:23:56.355866 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.357145 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release
\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.377329 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bea53c89d5c93c64e2a93e2cd125c76dc1a1a7e0b2de68ec7e298402b56743a6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea53c89d5c93c64e2a93e2cd125c76dc1a1a7e0b2de68ec7e298402b56743a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:23:55Z\\\",\\\"message\\\":\\\"rmers/externalversions/factory.go:140\\\\nI0319 10:23:55.485872 6723 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0319 10:23:55.486034 6723 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 10:23:55.486266 6723 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0319 10:23:55.486403 6723 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0319 10:23:55.486602 6723 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0319 10:23:55.486728 6723 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 10:23:55.486930 6723 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 10:23:55.487356 6723 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kub
e-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\
",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.388600 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:56 crc kubenswrapper[4765]: I0319 10:23:56.398256 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:56Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.139559 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/1.log" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.141320 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/0.log" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.143871 4765 generic.go:334] "Generic (PLEG): container finished" podID="71cc276b-f25c-460b-b718-f058cc1d2521" containerID="5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6" exitCode=1 Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.143914 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerDied","Data":"5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6"} Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.143952 4765 scope.go:117] "RemoveContainer" containerID="bea53c89d5c93c64e2a93e2cd125c76dc1a1a7e0b2de68ec7e298402b56743a6" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.144863 4765 scope.go:117] "RemoveContainer" containerID="5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6" Mar 19 10:23:57 crc kubenswrapper[4765]: E0319 10:23:57.145049 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.158895 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"c
ri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.174740 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.188395 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.202504 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b
24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.217634 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.230242 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.241527 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.264170 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.278188 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.292786 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.309855 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.329852 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.355261 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.355310 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.355436 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:57 crc kubenswrapper[4765]: E0319 10:23:57.355527 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:57 crc kubenswrapper[4765]: E0319 10:23:57.355654 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:57 crc kubenswrapper[4765]: E0319 10:23:57.355759 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.359061 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bea53c89d5c93c64e2a93e2cd125c76dc1a1a7e0b2de68ec7e298402b56743a6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:23:55Z\\\",\\\"message\\\":\\\"rmers/externalversions/factory.go:140\\\\nI0319 10:23:55.485872 6723 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0319 10:23:55.486034 6723 reflector.go:311] Stopping reflector 
*v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 10:23:55.486266 6723 reflector.go:311] Stopping reflector *v1.NetworkAttachmentDefinition (0s) from github.com/k8snetworkplumbingwg/network-attachment-definition-client/pkg/client/informers/externalversions/factory.go:117\\\\nI0319 10:23:55.486403 6723 reflector.go:311] Stopping reflector *v1.EgressFirewall (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0319 10:23:55.486602 6723 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0319 10:23:55.486728 6723 reflector.go:311] Stopping reflector *v1.Node (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 10:23:55.486930 6723 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0319 10:23:55.487356 6723 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:23:57Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0319 10:23:57.018257 6907 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify 
certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":
\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb
21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.373066 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.391679 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc 
kubenswrapper[4765]: I0319 10:23:57.410718 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: I0319 10:23:57.425152 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:57Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:57 crc kubenswrapper[4765]: E0319 10:23:57.478453 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.149031 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/1.log" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.157278 4765 scope.go:117] "RemoveContainer" containerID="5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6" Mar 19 10:23:58 crc kubenswrapper[4765]: E0319 10:23:58.157601 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.171498 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.195915 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.211326 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.227160 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.247892 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:23:57Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0319 10:23:57.018257 6907 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965
bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.260618 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.275066 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc 
kubenswrapper[4765]: I0319 10:23:58.299255 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.322280 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri
-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" 
name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"st
artedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.339935 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.355953 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:23:58 crc kubenswrapper[4765]: E0319 10:23:58.356130 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.360122 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.374773 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.385605 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.400639 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5ac
ee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.416053 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.432601 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:58 crc kubenswrapper[4765]: I0319 10:23:58.444862 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:58Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:59 crc kubenswrapper[4765]: I0319 10:23:59.355826 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:23:59 crc kubenswrapper[4765]: I0319 10:23:59.356023 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:59 crc kubenswrapper[4765]: I0319 10:23:59.356154 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:23:59 crc kubenswrapper[4765]: E0319 10:23:59.356161 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:23:59 crc kubenswrapper[4765]: E0319 10:23:59.356249 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:23:59 crc kubenswrapper[4765]: E0319 10:23:59.356338 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:23:59 crc kubenswrapper[4765]: I0319 10:23:59.844393 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:23:59 crc kubenswrapper[4765]: I0319 10:23:59.861437 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\
\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: 
failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:59Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:59 crc kubenswrapper[4765]: I0319 10:23:59.891340 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:23:57Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0319 10:23:57.018257 6907 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965
bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:59Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:59 crc kubenswrapper[4765]: I0319 10:23:59.906125 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:59Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:59 crc kubenswrapper[4765]: I0319 10:23:59.919235 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:59Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:59 crc 
kubenswrapper[4765]: I0319 10:23:59.946125 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:59Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:59 crc kubenswrapper[4765]: I0319 10:23:59.961716 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:59Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:59 crc kubenswrapper[4765]: I0319 10:23:59.976286 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:59Z is after 2025-08-24T17:21:41Z" Mar 19 10:23:59 crc kubenswrapper[4765]: I0319 10:23:59.989425 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs\") pod \"network-metrics-daemon-t8k4k\" (UID: \"ab39cf0a-a301-484b-9328-19acff8edae9\") " pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:23:59 crc kubenswrapper[4765]: E0319 10:23:59.989566 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:23:59 crc kubenswrapper[4765]: E0319 10:23:59.989623 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs podName:ab39cf0a-a301-484b-9328-19acff8edae9 nodeName:}" failed. No retries permitted until 2026-03-19 10:24:15.989606727 +0000 UTC m=+154.338552269 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs") pod "network-metrics-daemon-t8k4k" (UID: "ab39cf0a-a301-484b-9328-19acff8edae9") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:23:59 crc kubenswrapper[4765]: I0319 10:23:59.993677 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run
/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:23:59Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.006084 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:00Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.017687 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:00Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.033157 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5ac
ee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:00Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.048358 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:00Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.070811 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:00Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.086560 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:00Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.106942 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:00Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.125838 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:00Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.143355 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:00Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.355514 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:00 crc kubenswrapper[4765]: E0319 10:24:00.355721 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.459253 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.459301 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.459310 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.459327 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.459340 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:00Z","lastTransitionTime":"2026-03-19T10:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:00 crc kubenswrapper[4765]: E0319 10:24:00.477307 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:00Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.481586 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.481614 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.481623 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.481635 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.481644 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:00Z","lastTransitionTime":"2026-03-19T10:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:00 crc kubenswrapper[4765]: E0319 10:24:00.496221 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:00Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.500855 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.500912 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.500933 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.500981 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.501000 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:00Z","lastTransitionTime":"2026-03-19T10:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} [Two verbatim repetitions omitted: E0319 10:24:00.519129 and E0319 10:24:00.546182 each log the same "Error updating node status, will retry" error, with an identical node-status patch payload, image list, and "node.network-node-identity.openshift.io" webhook certificate-expiry failure; the intervening "Recording event message" entries and "Node became not ready" condition (I0319 10:24:00.524501 through 10:24:00.524629) likewise repeat.] Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.550990 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.551032 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.551044 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.551065 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:00 crc kubenswrapper[4765]: I0319 10:24:00.551080 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:00Z","lastTransitionTime":"2026-03-19T10:24:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:00 crc kubenswrapper[4765]: E0319 10:24:00.565947 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:00Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:00Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:00 crc kubenswrapper[4765]: E0319 10:24:00.566138 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 10:24:01 crc kubenswrapper[4765]: I0319 10:24:01.355363 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:01 crc kubenswrapper[4765]: I0319 10:24:01.355375 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:01 crc kubenswrapper[4765]: I0319 10:24:01.355469 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:01 crc kubenswrapper[4765]: E0319 10:24:01.356096 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:01 crc kubenswrapper[4765]: E0319 10:24:01.355854 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:01 crc kubenswrapper[4765]: E0319 10:24:01.356252 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.006883 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.007906 4765 scope.go:117] "RemoveContainer" containerID="5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6" Mar 19 10:24:02 crc kubenswrapper[4765]: E0319 10:24:02.008125 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.356093 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:02 crc kubenswrapper[4765]: E0319 10:24:02.356236 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.376801 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\"
:true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.396100 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.412273 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.426359 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc 
kubenswrapper[4765]: I0319 10:24:02.460017 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc kubenswrapper[4765]: E0319 10:24:02.480393 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.485200 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.508072 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.522882 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f
443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.539225 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.565137 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:23:57Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0319 10:23:57.018257 6907 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965
bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.578779 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.599465 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.617034 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.630927 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.648837 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.665105 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:02 crc kubenswrapper[4765]: I0319 10:24:02.684920 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b
24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:02Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:03 crc kubenswrapper[4765]: I0319 10:24:03.355143 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:03 crc kubenswrapper[4765]: E0319 10:24:03.355285 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:03 crc kubenswrapper[4765]: I0319 10:24:03.355147 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:03 crc kubenswrapper[4765]: I0319 10:24:03.355143 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:03 crc kubenswrapper[4765]: E0319 10:24:03.355365 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:03 crc kubenswrapper[4765]: E0319 10:24:03.355514 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:04 crc kubenswrapper[4765]: I0319 10:24:04.355999 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:04 crc kubenswrapper[4765]: E0319 10:24:04.356138 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:05 crc kubenswrapper[4765]: I0319 10:24:05.355362 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:05 crc kubenswrapper[4765]: I0319 10:24:05.355431 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:05 crc kubenswrapper[4765]: I0319 10:24:05.355384 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:05 crc kubenswrapper[4765]: E0319 10:24:05.355574 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:05 crc kubenswrapper[4765]: E0319 10:24:05.355635 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:05 crc kubenswrapper[4765]: E0319 10:24:05.355694 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:06 crc kubenswrapper[4765]: I0319 10:24:06.355975 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:06 crc kubenswrapper[4765]: E0319 10:24:06.356104 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:07 crc kubenswrapper[4765]: I0319 10:24:07.355496 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:07 crc kubenswrapper[4765]: I0319 10:24:07.355538 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:07 crc kubenswrapper[4765]: E0319 10:24:07.355635 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:07 crc kubenswrapper[4765]: I0319 10:24:07.355906 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:07 crc kubenswrapper[4765]: E0319 10:24:07.356058 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:07 crc kubenswrapper[4765]: E0319 10:24:07.355824 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:07 crc kubenswrapper[4765]: E0319 10:24:07.481184 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 10:24:08 crc kubenswrapper[4765]: I0319 10:24:08.356179 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:08 crc kubenswrapper[4765]: E0319 10:24:08.356393 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:09 crc kubenswrapper[4765]: I0319 10:24:09.355227 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:09 crc kubenswrapper[4765]: I0319 10:24:09.355228 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:09 crc kubenswrapper[4765]: E0319 10:24:09.355452 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:09 crc kubenswrapper[4765]: E0319 10:24:09.355575 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:09 crc kubenswrapper[4765]: I0319 10:24:09.355776 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:09 crc kubenswrapper[4765]: E0319 10:24:09.356067 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:10 crc kubenswrapper[4765]: I0319 10:24:10.355364 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:10 crc kubenswrapper[4765]: E0319 10:24:10.355629 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:10 crc kubenswrapper[4765]: I0319 10:24:10.968615 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:10 crc kubenswrapper[4765]: I0319 10:24:10.970193 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:10 crc kubenswrapper[4765]: I0319 10:24:10.970373 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:10 crc kubenswrapper[4765]: I0319 10:24:10.970538 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:10 crc kubenswrapper[4765]: I0319 10:24:10.970676 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:10Z","lastTransitionTime":"2026-03-19T10:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:10 crc kubenswrapper[4765]: E0319 10:24:10.992501 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:10Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:10 crc kubenswrapper[4765]: I0319 10:24:10.999514 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:10 crc kubenswrapper[4765]: I0319 10:24:10.999607 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:10 crc kubenswrapper[4765]: I0319 10:24:10.999656 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:10 crc kubenswrapper[4765]: I0319 10:24:10.999682 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:10 crc kubenswrapper[4765]: I0319 10:24:10.999701 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:10Z","lastTransitionTime":"2026-03-19T10:24:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:11 crc kubenswrapper[4765]: E0319 10:24:11.021894 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:10Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:11Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.028025 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.028085 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.028098 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.028121 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.028159 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:11Z","lastTransitionTime":"2026-03-19T10:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:11 crc kubenswrapper[4765]: E0319 10:24:11.048998 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:11Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.054673 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.054714 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.054751 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.054777 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.054792 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:11Z","lastTransitionTime":"2026-03-19T10:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:11 crc kubenswrapper[4765]: E0319 10:24:11.070437 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:11Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.074940 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.074987 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.074999 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.075020 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.075062 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:11Z","lastTransitionTime":"2026-03-19T10:24:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:11 crc kubenswrapper[4765]: E0319 10:24:11.090698 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:11Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:11Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:11 crc kubenswrapper[4765]: E0319 10:24:11.090893 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.355323 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.355473 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:11 crc kubenswrapper[4765]: I0319 10:24:11.356353 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:11 crc kubenswrapper[4765]: E0319 10:24:11.356571 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:11 crc kubenswrapper[4765]: E0319 10:24:11.356719 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:11 crc kubenswrapper[4765]: E0319 10:24:11.356839 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.355567 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:12 crc kubenswrapper[4765]: E0319 10:24:12.355755 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.369735 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.383952 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.397405 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.412530 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.425239 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.441115 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b
24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.454955 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.466489 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.476472 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc kubenswrapper[4765]: E0319 10:24:12.482137 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.489761 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc 
kubenswrapper[4765]: I0319 10:24:12.512782 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.528385 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.544897 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.568062 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.581044 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.601501 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:23:57Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0319 10:23:57.018257 6907 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965
bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:12 crc kubenswrapper[4765]: I0319 10:24:12.616731 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:12Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:13 crc kubenswrapper[4765]: I0319 10:24:13.356029 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:13 crc kubenswrapper[4765]: I0319 10:24:13.356186 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:13 crc kubenswrapper[4765]: E0319 10:24:13.356233 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:13 crc kubenswrapper[4765]: I0319 10:24:13.356517 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:13 crc kubenswrapper[4765]: E0319 10:24:13.356518 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:13 crc kubenswrapper[4765]: E0319 10:24:13.356664 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:14 crc kubenswrapper[4765]: I0319 10:24:14.355555 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:14 crc kubenswrapper[4765]: E0319 10:24:14.356330 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:14 crc kubenswrapper[4765]: I0319 10:24:14.356772 4765 scope.go:117] "RemoveContainer" containerID="5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.252253 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/1.log" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.256853 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerStarted","Data":"4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357"} Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.257492 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.271490 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.288276 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.300751 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.313521 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b
24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.326276 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.338698 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.350199 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.355151 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:15 crc kubenswrapper[4765]: E0319 10:24:15.355265 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.355423 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:15 crc kubenswrapper[4765]: E0319 10:24:15.355491 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.355708 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:15 crc kubenswrapper[4765]: E0319 10:24:15.355771 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.361029 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc 
kubenswrapper[4765]: I0319 10:24:15.390511 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.405279 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.418219 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.431990 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.448645 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.471842 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:23:57Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0319 10:23:57.018257 6907 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify 
certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"na
me\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.485687 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.498164 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:15 crc kubenswrapper[4765]: I0319 10:24:15.507114 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:15Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.024166 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs\") pod \"network-metrics-daemon-t8k4k\" (UID: \"ab39cf0a-a301-484b-9328-19acff8edae9\") " pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:16 crc kubenswrapper[4765]: E0319 10:24:16.024527 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:24:16 crc kubenswrapper[4765]: E0319 10:24:16.024654 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs podName:ab39cf0a-a301-484b-9328-19acff8edae9 nodeName:}" failed. No retries permitted until 2026-03-19 10:24:48.024627454 +0000 UTC m=+186.373573056 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs") pod "network-metrics-daemon-t8k4k" (UID: "ab39cf0a-a301-484b-9328-19acff8edae9") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.263412 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/2.log" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.264583 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/1.log" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.268858 4765 generic.go:334] "Generic (PLEG): container finished" podID="71cc276b-f25c-460b-b718-f058cc1d2521" containerID="4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357" exitCode=1 Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.268925 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerDied","Data":"4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357"} Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.269001 4765 scope.go:117] "RemoveContainer" containerID="5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.270533 4765 scope.go:117] "RemoveContainer" containerID="4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357" Mar 19 10:24:16 crc kubenswrapper[4765]: E0319 10:24:16.271152 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.296850 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.315836 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.335990 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b
24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.350231 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mou
ntPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f382
6607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.356263 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:16 crc kubenswrapper[4765]: E0319 10:24:16.356494 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.364549 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.377642 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.391438 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.406418 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.426489 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.443408 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.463931 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://5ef1fe96b949d3cf2ae5a258abc8617e18299491bee755dd5668ac9ec3ff7ba6\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:23:57Z\\\",\\\"message\\\":\\\"ault: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-apiserver/check-endpoints_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-apiserver/check-endpoints\\\\\\\"}, 
Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.139\\\\\\\", Port:17698, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nF0319 10:23:57.018257 6907 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certifi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:56Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:24:15Z\\\",\\\"message\\\":\\\"efault, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.214\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0319 10:24:15.252311 7122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 10:24:15.252364 7122 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:24:14Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\
\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f78
76cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.478829 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.492869 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc 
kubenswrapper[4765]: I0319 10:24:16.515834 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.535537 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.552141 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:16 crc kubenswrapper[4765]: I0319 10:24:16.564127 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:16Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.276505 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/2.log" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.281051 4765 scope.go:117] "RemoveContainer" containerID="4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357" Mar 19 10:24:17 crc kubenswrapper[4765]: E0319 10:24:17.281196 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.308397 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.323123 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.345624 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.356093 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.356158 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.356156 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:17 crc kubenswrapper[4765]: E0319 10:24:17.356286 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:17 crc kubenswrapper[4765]: E0319 10:24:17.356482 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:17 crc kubenswrapper[4765]: E0319 10:24:17.356737 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.380309 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:24:15Z\\\",\\\"message\\\":\\\"efault, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.214\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0319 10:24:15.252311 7122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 10:24:15.252364 7122 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:24:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965
bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.397652 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.415948 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc 
kubenswrapper[4765]: I0319 10:24:17.459122 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: E0319 10:24:17.483434 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
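Nearly every "Failed to update status for pod" entry in this log shares one root cause: the `pod.network-node-identity.openshift.io` webhook's serving certificate expired on 2025-08-24T17:21:41Z, while the node clock reads 2026-03-19. As a diagnostic aid, a minimal sketch (the `find_expired_cert` helper is hypothetical, not part of any cluster tooling) that extracts this condition from journal lines by matching the exact Go x509 error string seen above:

```python
import re

# Matches the x509 verification error emitted repeatedly in the kubelet log:
# "certificate has expired or is not yet valid: current time <now> is after <notAfter>"
CERT_EXPIRED_RE = re.compile(
    r"certificate has expired or is not yet valid: "
    r"current time (?P<now>[0-9TZ:\-]+) is after (?P<not_after>[0-9TZ:\-]+)"
)

def find_expired_cert(log_line: str):
    """Return (current_time, not_after) if the line reports an expired cert, else None."""
    m = CERT_EXPIRED_RE.search(log_line)
    if m:
        return m.group("now"), m.group("not_after")
    return None

# Sample taken verbatim from the error text in the entries above.
line = ('failed to call webhook: Post "https://127.0.0.1:9743/pod?timeout=10s": '
        'tls: failed to verify certificate: x509: certificate has expired or is '
        'not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z')
print(find_expired_cert(line))
```

Piping `journalctl -u kubelet` through such a filter collapses the thousands of repeated status-patch failures into the single actionable fact: the webhook cert's notAfter date is in the past relative to the node clock.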
Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.484771 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cr
i-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"l
astState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 
'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"
,\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.508088 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.532851 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f
443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.558211 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.572615 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.588113 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.603270 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.617075 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.631517 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:17 crc kubenswrapper[4765]: I0319 10:24:17.648781 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b
24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:17Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:18 crc kubenswrapper[4765]: I0319 10:24:18.355203 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:18 crc kubenswrapper[4765]: E0319 10:24:18.355395 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:19 crc kubenswrapper[4765]: I0319 10:24:19.164330 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.164724 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:23.164663869 +0000 UTC m=+221.513609441 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:24:19 crc kubenswrapper[4765]: I0319 10:24:19.165309 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.165484 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.165573 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:25:23.165555438 +0000 UTC m=+221.514501020 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:24:19 crc kubenswrapper[4765]: I0319 10:24:19.266391 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:19 crc kubenswrapper[4765]: I0319 10:24:19.266523 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:19 crc kubenswrapper[4765]: I0319 10:24:19.266574 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.266595 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.266630 4765 projected.go:288] Couldn't get configMap 
openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.266644 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.266695 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 10:25:23.266678959 +0000 UTC m=+221.615624501 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.266739 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.266834 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:25:23.266814864 +0000 UTC m=+221.615760446 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.267134 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.267222 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.267252 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.267377 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 10:25:23.267341411 +0000 UTC m=+221.616286993 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:24:19 crc kubenswrapper[4765]: I0319 10:24:19.356013 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:19 crc kubenswrapper[4765]: I0319 10:24:19.356120 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:19 crc kubenswrapper[4765]: I0319 10:24:19.356218 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.356481 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.356547 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:19 crc kubenswrapper[4765]: E0319 10:24:19.356696 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:19 crc kubenswrapper[4765]: I0319 10:24:19.373927 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 19 10:24:20 crc kubenswrapper[4765]: I0319 10:24:20.355895 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:20 crc kubenswrapper[4765]: E0319 10:24:20.356162 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.355843 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.355886 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:21 crc kubenswrapper[4765]: E0319 10:24:21.356148 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:21 crc kubenswrapper[4765]: E0319 10:24:21.356461 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.356537 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:21 crc kubenswrapper[4765]: E0319 10:24:21.356612 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.486684 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.486748 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.486759 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.486779 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.486791 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:21Z","lastTransitionTime":"2026-03-19T10:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:21 crc kubenswrapper[4765]: E0319 10:24:21.508784 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:21Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.514534 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.514588 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.514606 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.514635 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.514654 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:21Z","lastTransitionTime":"2026-03-19T10:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:21 crc kubenswrapper[4765]: E0319 10:24:21.539400 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:21Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.545318 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.545414 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.545444 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.545482 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.545508 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:21Z","lastTransitionTime":"2026-03-19T10:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:21 crc kubenswrapper[4765]: E0319 10:24:21.568726 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:21Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.574581 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.574662 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.574686 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.574718 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.574739 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:21Z","lastTransitionTime":"2026-03-19T10:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:21 crc kubenswrapper[4765]: E0319 10:24:21.599664 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:21Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.606339 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.606394 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.606407 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.606429 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:21 crc kubenswrapper[4765]: I0319 10:24:21.606447 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:21Z","lastTransitionTime":"2026-03-19T10:24:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:21 crc kubenswrapper[4765]: E0319 10:24:21.627042 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:21Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:21Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:21 crc kubenswrapper[4765]: E0319 10:24:21.627195 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.355587 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:22 crc kubenswrapper[4765]: E0319 10:24:22.355794 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.386532 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"
/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036
cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117
b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\
\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.404489 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c58
4126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.424214 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.441137 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f
443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.457818 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.482334 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:24:15Z\\\",\\\"message\\\":\\\"efault, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.214\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0319 10:24:15.252311 7122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 10:24:15.252364 7122 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:24:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965
bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: E0319 10:24:22.484929 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.500945 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.516781 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 
2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.534792 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.549352 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.563191 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600dd10a-73da-46c1-8583-344b6ad4dfbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c9e97886fc75e766ec52c9eacb1c11e05d38f7bc2a5b5b1529561c48d7199d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.585145 4765 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6
b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-
host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.601526 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.616104 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.642982 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b
24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.661035 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\
\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.680313 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:22 crc kubenswrapper[4765]: I0319 10:24:22.695244 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:22Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:23 crc kubenswrapper[4765]: I0319 10:24:23.355238 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:23 crc kubenswrapper[4765]: I0319 10:24:23.355328 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:23 crc kubenswrapper[4765]: I0319 10:24:23.355450 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:23 crc kubenswrapper[4765]: E0319 10:24:23.355616 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:23 crc kubenswrapper[4765]: E0319 10:24:23.355711 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:23 crc kubenswrapper[4765]: E0319 10:24:23.355945 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:24 crc kubenswrapper[4765]: I0319 10:24:24.355130 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:24 crc kubenswrapper[4765]: E0319 10:24:24.355295 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:25 crc kubenswrapper[4765]: I0319 10:24:25.356095 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:25 crc kubenswrapper[4765]: I0319 10:24:25.356182 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:25 crc kubenswrapper[4765]: I0319 10:24:25.356182 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:25 crc kubenswrapper[4765]: E0319 10:24:25.356302 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:25 crc kubenswrapper[4765]: E0319 10:24:25.356526 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:25 crc kubenswrapper[4765]: E0319 10:24:25.356711 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:26 crc kubenswrapper[4765]: I0319 10:24:26.356151 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:26 crc kubenswrapper[4765]: E0319 10:24:26.356444 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:27 crc kubenswrapper[4765]: I0319 10:24:27.355424 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:27 crc kubenswrapper[4765]: E0319 10:24:27.355660 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:27 crc kubenswrapper[4765]: I0319 10:24:27.355753 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:27 crc kubenswrapper[4765]: I0319 10:24:27.355778 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:27 crc kubenswrapper[4765]: E0319 10:24:27.356108 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:27 crc kubenswrapper[4765]: E0319 10:24:27.356187 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:27 crc kubenswrapper[4765]: E0319 10:24:27.487281 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 10:24:28 crc kubenswrapper[4765]: I0319 10:24:28.356472 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:28 crc kubenswrapper[4765]: E0319 10:24:28.356772 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:29 crc kubenswrapper[4765]: I0319 10:24:29.356204 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:29 crc kubenswrapper[4765]: I0319 10:24:29.356317 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:29 crc kubenswrapper[4765]: I0319 10:24:29.356598 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:29 crc kubenswrapper[4765]: E0319 10:24:29.356947 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:29 crc kubenswrapper[4765]: E0319 10:24:29.357066 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:29 crc kubenswrapper[4765]: E0319 10:24:29.357100 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:29 crc kubenswrapper[4765]: I0319 10:24:29.379095 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 19 10:24:30 crc kubenswrapper[4765]: I0319 10:24:30.355117 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:30 crc kubenswrapper[4765]: E0319 10:24:30.355266 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.355506 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.355593 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.355506 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:31 crc kubenswrapper[4765]: E0319 10:24:31.355719 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:31 crc kubenswrapper[4765]: E0319 10:24:31.356396 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:31 crc kubenswrapper[4765]: E0319 10:24:31.356795 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.356840 4765 scope.go:117] "RemoveContainer" containerID="4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357" Mar 19 10:24:31 crc kubenswrapper[4765]: E0319 10:24:31.357118 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.775502 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.775558 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.775575 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.775599 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.775618 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:31Z","lastTransitionTime":"2026-03-19T10:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:31 crc kubenswrapper[4765]: E0319 10:24:31.799145 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.804306 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.804348 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.804358 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.804374 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.804386 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:31Z","lastTransitionTime":"2026-03-19T10:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:31 crc kubenswrapper[4765]: E0319 10:24:31.827895 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.833536 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.833585 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.833596 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.833621 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.833634 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:31Z","lastTransitionTime":"2026-03-19T10:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:31 crc kubenswrapper[4765]: E0319 10:24:31.855374 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.861478 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.861570 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.861600 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.861642 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.861675 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:31Z","lastTransitionTime":"2026-03-19T10:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:31 crc kubenswrapper[4765]: E0319 10:24:31.886369 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.892518 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.892591 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.892610 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.892649 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:31 crc kubenswrapper[4765]: I0319 10:24:31.892671 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:31Z","lastTransitionTime":"2026-03-19T10:24:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:31 crc kubenswrapper[4765]: E0319 10:24:31.912949 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:31Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:31Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:31 crc kubenswrapper[4765]: E0319 10:24:31.913118 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.355302 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:32 crc kubenswrapper[4765]: E0319 10:24:32.355583 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.373070 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.388856 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.406378 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b
24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.417841 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600dd10a-73da-46c1-8583-344b6ad4dfbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c9e97886fc75e766ec52c9eacb1c11e05d38f7bc2a5b5b1529561c48d7199d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.430778 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.451404 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.469899 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: E0319 10:24:32.488735 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.490983 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94e626d-90aa-4f89-a16b-d7e1c9eaaeae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://521a28023428a22264f3da9e2eb80eb1726c4ce7fc6c3bedcc4aa4fece1dce52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24f8fe40e3530732fc9f7682bdce072be0f8d05de0373e6c3a7157d00d4001f2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:43Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start 
--config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 10:22:14.156952 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 10:22:14.157988 1 observer_polling.go:159] Starting file observer\\\\nI0319 10:22:14.159139 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 10:22:14.159830 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 10:22:43.796213 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 10:22:43.796281 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:14Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e788c4b7d579a7df5887b915a34e162d4961a7767a1c54e6d9fcc3c2917d550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a87713adc9f4b89e52a3a22705557d52f4258ca4d042b27d0fc0826ae5dbd02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86f2e0fb32cf6d35487d16c0464a828889fefc55528da6b8de5f2d70fd3273e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.511697 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.527273 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\
\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.546492 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\
\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.563131 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.586690 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:24:15Z\\\",\\\"message\\\":\\\"efault, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.214\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0319 10:24:15.252311 7122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 10:24:15.252364 7122 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:24:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965
bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.600981 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.614083 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc 
kubenswrapper[4765]: I0319 10:24:32.641298 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.656493 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.669942 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:32 crc kubenswrapper[4765]: I0319 10:24:32.688680 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:32Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:33 crc kubenswrapper[4765]: I0319 10:24:33.356138 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:33 crc kubenswrapper[4765]: I0319 10:24:33.356164 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:33 crc kubenswrapper[4765]: E0319 10:24:33.356288 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:33 crc kubenswrapper[4765]: E0319 10:24:33.356657 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:33 crc kubenswrapper[4765]: I0319 10:24:33.357074 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:33 crc kubenswrapper[4765]: E0319 10:24:33.357206 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.355699 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:34 crc kubenswrapper[4765]: E0319 10:24:34.355992 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.364536 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mmrh7_d9d027fd-4e70-4daf-9dd2-adefcc2a868f/kube-multus/0.log" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.364627 4765 generic.go:334] "Generic (PLEG): container finished" podID="d9d027fd-4e70-4daf-9dd2-adefcc2a868f" containerID="10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0" exitCode=1 Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.366619 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mmrh7" event={"ID":"d9d027fd-4e70-4daf-9dd2-adefcc2a868f","Type":"ContainerDied","Data":"10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0"} Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.367206 4765 scope.go:117] "RemoveContainer" containerID="10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.385852 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.406380 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.434557 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600dd10a-73da-46c1-8583-344b6ad4dfbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c9e97886fc75e766ec52c9eacb1c11e05d38f7bc2a5b5b1529561c48d7199d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.457719 4765 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6
b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-
host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.479792 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.499017 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.529188 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b
24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.544147 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94e626d-90aa-4f89-a16b-d7e1c9eaaeae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://521a28023428a22264f3da9e2eb80eb1726c4ce7fc6c3bedcc4aa4fece1dce52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24f8fe40e3530732fc9f7682bdce072be0f8d05de0373e6c3a7157d00d4001f2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:43Z\\\",\\\"message\\\
":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 10:22:14.156952 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 10:22:14.157988 1 observer_polling.go:159] Starting file observer\\\\nI0319 10:22:14.159139 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 10:22:14.159830 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 10:22:43.796213 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 10:22:43.796281 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:14Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e788c4b7d579a7df5887b915a34e162d4961a7767a1c54e6d9fcc3c2917d550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a87713adc9f4b89e52a3a22705557d52f4258ca4d042b27d0fc0826ae5dbd02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86f2e0fb32cf6d35487d16c0464a828889fefc55528da6b8de5f2d70fd3273e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.563171 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.578141 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.593430 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.617889 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:24:15Z\\\",\\\"message\\\":\\\"efault, existing lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", 
\\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.214\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0319 10:24:15.252311 7122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 10:24:15.252364 7122 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:24:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965
bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.629099 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.641483 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc 
kubenswrapper[4765]: I0319 10:24:34.660634 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.674884 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cer
t-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" 
limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.688991 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.703282 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:34 crc kubenswrapper[4765]: I0319 10:24:34.718519 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:34Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:24:33Z\\\",\\\"message\\\":\\\"2026-03-19T10:23:48+00:00 [cnibincopy] Successfully copied files in 
/usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0085bfb7-44ab-451d-beea-360c3ab5abb4\\\\n2026-03-19T10:23:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0085bfb7-44ab-451d-beea-360c3ab5abb4 to /host/opt/cni/bin/\\\\n2026-03-19T10:23:48Z [verbose] multus-daemon started\\\\n2026-03-19T10:23:48Z [verbose] Readiness Indicator file check\\\\n2026-03-19T10:24:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/c
ni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:34Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.355774 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.355924 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.355788 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:35 crc kubenswrapper[4765]: E0319 10:24:35.356166 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:35 crc kubenswrapper[4765]: E0319 10:24:35.356316 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:35 crc kubenswrapper[4765]: E0319 10:24:35.356507 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.373814 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mmrh7_d9d027fd-4e70-4daf-9dd2-adefcc2a868f/kube-multus/0.log" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.374132 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mmrh7" event={"ID":"d9d027fd-4e70-4daf-9dd2-adefcc2a868f","Type":"ContainerStarted","Data":"c7758269c24290d83fecce42e3d1f3e9569ca7c19b2c66970a979d98455cb6f2"} Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.397563 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94e626d-90aa-4f89-a16b-d7e1c9eaaeae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://521a28023428a22264f3da9e2eb80eb1726c4ce7fc6c3bedcc4aa4fece1dce52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24f8fe40e3530732fc9f7682bdce072be0f8d05de0373e6c3a7157d00d4001f2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:43Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 10:22:14.156952 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 10:22:14.157988 1 observer_polling.go:159] Starting file observer\\\\nI0319 10:22:14.159139 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 10:22:14.159830 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 10:22:43.796213 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 10:22:43.796281 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:14Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e788c4b7d579a7df5887b915a34e162d4961a7767a1c54e6d9fcc3c2917d550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a87713adc9f4b89e52a3a22705557d52f4258ca4d042b27d0fc0826ae5dbd02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86f2e0fb32cf6d35487d16c0464a828889fefc55528da6b8de5f2d70fd3273e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.415691 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.437076 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.455497 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.527757 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.555529 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"vo
lumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 
secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.574526 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.592551 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f
443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.606773 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7758269c24290d83fecce42e3d1f3e9569ca7c19b2c66970a979d98455cb6f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:24:33Z\\\",\\\"message\\\":\\\"2026-03-19T10:23:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0085bfb7-44ab-451d-beea-360c3ab5abb4\\\\n2026-03-19T10:23:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0085bfb7-44ab-451d-beea-360c3ab5abb4 to /host/opt/cni/bin/\\\\n2026-03-19T10:23:48Z [verbose] multus-daemon started\\\\n2026-03-19T10:23:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-19T10:24:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.629280 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:24:15Z\\\",\\\"message\\\":\\\"efault, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.214\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0319 10:24:15.252311 7122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 10:24:15.252364 7122 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:24:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965
bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.640662 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.652025 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc 
kubenswrapper[4765]: I0319 10:24:35.668913 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.683790 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.698462 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600dd10a-73da-46c1-8583-344b6ad4dfbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c9e97886fc75e766ec52c9eacb1c11e05d38f7bc2a5b5b1529561c48d7199d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac
-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.713459 4765 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6
b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-
host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.729143 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.742010 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:35 crc kubenswrapper[4765]: I0319 10:24:35.758696 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b
24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:35Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:36 crc kubenswrapper[4765]: I0319 10:24:36.356502 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:36 crc kubenswrapper[4765]: E0319 10:24:36.356766 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:37 crc kubenswrapper[4765]: I0319 10:24:37.355527 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:37 crc kubenswrapper[4765]: E0319 10:24:37.355689 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:37 crc kubenswrapper[4765]: I0319 10:24:37.355869 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:37 crc kubenswrapper[4765]: I0319 10:24:37.356002 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:37 crc kubenswrapper[4765]: E0319 10:24:37.356204 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:37 crc kubenswrapper[4765]: E0319 10:24:37.356493 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:37 crc kubenswrapper[4765]: E0319 10:24:37.490113 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 10:24:38 crc kubenswrapper[4765]: I0319 10:24:38.358785 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:38 crc kubenswrapper[4765]: E0319 10:24:38.359051 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:39 crc kubenswrapper[4765]: I0319 10:24:39.355490 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:39 crc kubenswrapper[4765]: I0319 10:24:39.355540 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:39 crc kubenswrapper[4765]: I0319 10:24:39.355540 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:39 crc kubenswrapper[4765]: E0319 10:24:39.355708 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:39 crc kubenswrapper[4765]: E0319 10:24:39.355845 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:39 crc kubenswrapper[4765]: E0319 10:24:39.356024 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:40 crc kubenswrapper[4765]: I0319 10:24:40.355556 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:40 crc kubenswrapper[4765]: E0319 10:24:40.355842 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:41 crc kubenswrapper[4765]: I0319 10:24:41.356176 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:41 crc kubenswrapper[4765]: I0319 10:24:41.356177 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:41 crc kubenswrapper[4765]: E0319 10:24:41.356351 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:41 crc kubenswrapper[4765]: E0319 10:24:41.356390 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:41 crc kubenswrapper[4765]: I0319 10:24:41.356176 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:41 crc kubenswrapper[4765]: E0319 10:24:41.356467 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.090770 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.090836 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.090846 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.090866 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.090877 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:42Z","lastTransitionTime":"2026-03-19T10:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:42 crc kubenswrapper[4765]: E0319 10:24:42.110373 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.117538 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.117579 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.117593 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.117613 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.117629 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:42Z","lastTransitionTime":"2026-03-19T10:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:42 crc kubenswrapper[4765]: E0319 10:24:42.135129 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{ … status patch payload identical to the previous attempt, elided … }\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.140374 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.140415 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.140428 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.140451 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.140463 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:42Z","lastTransitionTime":"2026-03-19T10:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:42 crc kubenswrapper[4765]: E0319 10:24:42.157265 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{ … status patch payload identical to the previous attempt, elided … }\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.162149 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.162215 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.162229 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.162252 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.162265 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:42Z","lastTransitionTime":"2026-03-19T10:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:42 crc kubenswrapper[4765]: E0319 10:24:42.178294 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.183079 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.183126 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.183138 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.183157 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.183173 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:42Z","lastTransitionTime":"2026-03-19T10:24:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:42 crc kubenswrapper[4765]: E0319 10:24:42.197914 4765 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:42Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"f831093e-daf4-4112-8683-64c2fcb4a46e\\\",\\\"systemUUID\\\":\\\"7efe4e5f-b64c-4a0b-8f3f-69f763aea23b\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: E0319 10:24:42.198085 4765 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.355559 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:42 crc kubenswrapper[4765]: E0319 10:24:42.356212 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.356484 4765 scope.go:117] "RemoveContainer" containerID="4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.376885 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94e626d-90aa-4f89-a16b-d7e1c9eaaeae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://521a28023428a22264f3da9e2eb80eb1726c4ce7fc6c3bedcc4aa4fece1dce52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24f8fe40e3530732fc9f7682bdce072be0f8d05de0373e6c3a7157d00d4001f2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:43Z\\\"
,\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0319 10:22:14.156952 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 10:22:14.157988 1 observer_polling.go:159] Starting file observer\\\\nI0319 10:22:14.159139 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 10:22:14.159830 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 10:22:43.796213 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 10:22:43.796281 1 cmd.go:179] failed checking apiserver connectivity: 
Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:14Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e788c4b7d579a7df5887b915a34e162d4961a7767a1c54e6d9fcc3c2917d550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a87713adc9f4b89e52a3a22705557d52f4258ca4d042b27d0fc0826ae5dbd02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86f2e0fb32cf6d35487d16c0464a828889fefc55528da6b8de5f2d70fd3273e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.399345 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.422779 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.439928 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.456591 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7758269c24290d83fecce42e3d1f3e9569ca7c19b2c66970a979d98455cb6f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:24:33Z\\\",\\\"message\\\":\\\"2026-03-19T10:23:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0085bfb7-44ab-451d-beea-360c3ab5abb4\\\\n2026-03-19T10:23:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0085bfb7-44ab-451d-beea-360c3ab5abb4 to /host/opt/cni/bin/\\\\n2026-03-19T10:23:48Z [verbose] multus-daemon started\\\\n2026-03-19T10:23:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-19T10:24:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.483860 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:24:15Z\\\",\\\"message\\\":\\\"efault, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.214\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0319 10:24:15.252311 7122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 10:24:15.252364 7122 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:24:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965
bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: E0319 10:24:42.491220 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.504886 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.519057 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready 
status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 
2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.557108 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/li
b/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restar
tCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21
:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.576932 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\"
,\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of 
insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c58
4126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.593293 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.607064 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f
443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.623579 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.635468 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.658714 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97
f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\
\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\
":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5ac
ee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.672214 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"600dd10a-73da-46c1-8583-344b6ad4dfbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c9e97886fc75e766ec52c9eacb1c11e05d38f7bc2a5b5b1529561c48d7199d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.687628 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.707537 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:42 crc kubenswrapper[4765]: I0319 10:24:42.723549 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:42Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.356091 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.356091 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.356150 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:43 crc kubenswrapper[4765]: E0319 10:24:43.356744 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:43 crc kubenswrapper[4765]: E0319 10:24:43.357042 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:43 crc kubenswrapper[4765]: E0319 10:24:43.357177 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.409089 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/2.log" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.411977 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerStarted","Data":"fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9"} Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.412500 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.430565 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.451861 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.468683 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.483874 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94e626d-90aa-4f89-a16b-d7e1c9eaaeae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://521a28023428a22264f3da9e2eb80eb1726c4ce7fc6c3bedcc4aa4fece1dce52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24f8fe40e3530732fc9f7682bdce072be0f8d05de0373e6c3a7157d00d4001f2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:43Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 10:22:14.156952 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 10:22:14.157988 1 observer_polling.go:159] Starting file observer\\\\nI0319 10:22:14.159139 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 10:22:14.159830 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 10:22:43.796213 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 10:22:43.796281 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:14Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e788c4b7d579a7df5887b915a34e162d4961a7767a1c54e6d9fcc3c2917d550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a87713adc9f4b89e52a3a22705557d52f4258ca4d042b27d0fc0826ae5dbd02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86f2e0fb32cf6d35487d16c0464a828889fefc55528da6b8de5f2d70fd3273e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.496671 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597
126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.509658 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.522019 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.538181 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7758269c24290d83fecce42e3d1f3e9569ca7c19b2c66970a979d98455cb6f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:24:33Z\\\",\\\"message\\\":\\\"2026-03-19T10:23:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0085bfb7-44ab-451d-beea-360c3ab5abb4\\\\n2026-03-19T10:23:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0085bfb7-44ab-451d-beea-360c3ab5abb4 to /host/opt/cni/bin/\\\\n2026-03-19T10:23:48Z [verbose] multus-daemon started\\\\n2026-03-19T10:23:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-19T10:24:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.559691 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:24:15Z\\\",\\\"message\\\":\\\"efault, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.214\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0319 10:24:15.252311 7122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 10:24:15.252364 7122 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:24:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\
\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\
\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.579838 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.596229 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc 
kubenswrapper[4765]: I0319 10:24:43.627112 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.642915 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c70952
9cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.660244 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.676768 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.691598 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.705079 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.731354 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b
24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:43 crc kubenswrapper[4765]: I0319 10:24:43.745575 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600dd10a-73da-46c1-8583-344b6ad4dfbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c9e97886fc75e766ec52c9eacb1c11e05d38f7bc2a5b5b1529561c48d7199d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.355158 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:44 crc kubenswrapper[4765]: E0319 10:24:44.355358 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.418533 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/3.log" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.419699 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/2.log" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.423274 4765 generic.go:334] "Generic (PLEG): container finished" podID="71cc276b-f25c-460b-b718-f058cc1d2521" containerID="fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9" exitCode=1 Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.423342 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerDied","Data":"fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9"} Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.423409 4765 scope.go:117] "RemoveContainer" containerID="4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.424575 4765 scope.go:117] "RemoveContainer" containerID="fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9" Mar 19 10:24:44 crc 
kubenswrapper[4765]: E0319 10:24:44.425472 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.459411 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-wcdqx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c2b2ebd2-48e2-431e-a91d-faa3fc4f3965\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://74f2fddfd0d16c709529cd25dd4c393e3f81f3220af523d3fcd117a366885929\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"n
ame\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-4sdtb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:30Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-wcdqx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.481046 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.494603 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e198838b-852b-411d-a205-770108496086\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://0acf077476928b4db840333639157f96965acc6d25d118888764106552b1ff9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b76888e5f37004c49dae7b699384a01a23ac2866133683b5e13ddf0b91d630ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bbd666081c0a630db74c6ffdf5096fc9c108200f417655b616da8670ff161916\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://831ce50afa10ecee18313f3826607c200bcc970f146374068901a987b0fff33f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.515478 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.536056 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.563302 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b
24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.580557 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"600dd10a-73da-46c1-8583-344b6ad4dfbc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://90c9e97886fc75e766ec52c9eacb1c11e05d38f7bc2a5b5b1529561c48d7199d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\
":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://1b257842b527d186b86c9ad25f9f7421e83f8b50eda3ec632138818d8ff01593\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.604752 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://15c753097be6f11c9f4393831749947aeac91465b4600fcb163894c7bbe57ecf\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.628354 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.647877 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e0a1287f-75c3-4e89-899e-d0cdd6575f9c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4a30cb301ead9f45d5f6426e0b1bbf54ad0bd4788a69b6fd8468b5e5db5ff6ad\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://44ee46c5aef7fb65d271287bb29010a30473a
ed207994ef37787ccfad61cd2b6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-nf4m5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:43Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kv7q9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.669113 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a94e626d-90aa-4f89-a16b-d7e1c9eaaeae\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:01Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://521a28023428a22264f3da9e2eb80eb1726c4ce7fc6c3bedcc4aa4fece1dce52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://24f8fe40e3530732fc9f7682bdce072be0f8d05de0373e6c3a7157d00d4001f2\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:43Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager 
-v=2\\\\nI0319 10:22:14.156952 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0319 10:22:14.157988 1 observer_polling.go:159] Starting file observer\\\\nI0319 10:22:14.159139 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0319 10:22:14.159830 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0319 10:22:43.796213 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0319 10:22:43.796281 1 cmd.go:179] failed checking apiserver connectivity: Unauthorized\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:14Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:22:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e788c4b7d579a7df5887b915a34e162d4961a7767a1c54e6d9fcc3c2917d550\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\
\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://3a87713adc9f4b89e52a3a22705557d52f4258ca4d042b27d0fc0826ae5dbd02\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://86f2e0fb32cf6d35487d16c0464a828889fefc55528da6b8de5f2d70fd3273e0\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"n
ame\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.687820 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"006f7d04-2c90-47e9-983d-45318e2fc84e\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:59Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597
126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-19T10:22:54Z\\\",\\\"message\\\":\\\"le observer\\\\nW0319 10:22:54.055824 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0319 10:22:54.057583 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0319 10:22:54.061336 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3002746326/tls.crt::/tmp/serving-cert-3002746326/tls.key\\\\\\\"\\\\nI0319 10:22:54.323035 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0319 10:22:54.325787 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0319 10:22:54.325804 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0319 10:22:54.325826 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0319 10:22:54.325831 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating 
requests\\\\\\\" limit=200\\\\nI0319 10:22:54.332407 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nI0319 10:22:54.332420 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nW0319 10:22:54.332436 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332442 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0319 10:22:54.332446 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0319 10:22:54.332450 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0319 10:22:54.332454 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0319 10:22:54.332457 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nF0319 10:22:54.334421 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:22:53Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"sta
rtedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.708822 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a6d2de5042ce3afb36291006d0ca45e8a38bcff239b3b6891e90f0db6e08ffe4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.723194 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:39Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b6ee00d00c5c8526dd0adb87162fd15dccec700fefc094ebfcaee404e700e7e8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f7294342094de0f443daf1a1adff9c56023383073370ed73629e6daf3c8f5f61\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.744540 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-mmrh7" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d9d027fd-4e70-4daf-9dd2-adefcc2a868f\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:24:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c7758269c24290d83fecce42e3d1f3e9569ca7c19b2c66970a979d98455cb6f2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:24:33Z\\\",\\\"message\\\":\\\"2026-03-19T10:23:48+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_0085bfb7-44ab-451d-beea-360c3ab5abb4\\\\n2026-03-19T10:23:48+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_0085bfb7-44ab-451d-beea-360c3ab5abb4 to /host/opt/cni/bin/\\\\n2026-03-19T10:23:48Z [verbose] multus-daemon started\\\\n2026-03-19T10:23:48Z [verbose] 
Readiness Indicator file check\\\\n2026-03-19T10:24:33Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:24:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-98wwh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-mmrh7\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.770920 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"71cc276b-f25c-460b-b718-f058cc1d2521\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4a3ce8f4fdb7a989726022fd26e1f365488a9ba9db242035ccfaccd92f600357\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:24:15Z\\\",\\\"message\\\":\\\"efault, existing lbs: 
[]services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"20da2226-531c-4179-9810-aa4026995ca3\\\\\\\", Protocol:\\\\\\\"tcp\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:false, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{}, Templates:services.TemplateMap{}, Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}, built lbs: []services.LB{services.LB{Name:\\\\\\\"Service_openshift-marketplace/certified-operators_TCP_cluster\\\\\\\", UUID:\\\\\\\"\\\\\\\", Protocol:\\\\\\\"TCP\\\\\\\", ExternalIDs:map[string]string{\\\\\\\"k8s.ovn.org/kind\\\\\\\":\\\\\\\"Service\\\\\\\", \\\\\\\"k8s.ovn.org/owner\\\\\\\":\\\\\\\"openshift-marketplace/certified-operators\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.214\\\\\\\", Port:50051, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0319 10:24:15.252311 7122 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0319 10:24:15.252364 7122 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:24:14Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-19T10:24:43Z\\\",\\\"message\\\":\\\"4:43.382715 7456 ovn.go:134] Ensuring zone local for Pod openshift-network-console/networking-console-plugin-85b44fc459-gdk6g in node crc\\\\nI0319 10:24:43.382705 7456 obj_retry.go:303] Retry object setup: *v1.Pod openshift-machine-config-operator/kube-rbac-proxy-crio-crc\\\\nI0319 10:24:43.382725 7456 obj_retry.go:365] Adding new object: *v1.Pod openshift-kube-controller-manager/kube-controller-manager-crc\\\\nF0319 10:24:43.382718 7456 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:43Z is after 2025-08-24T17:21:41Z]\\\\nI0319 10:24:43.382733 7456 obj_retry.go:303] Retry object setup: *v1.Pod 
openshift-dns/node-resolver\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-19T10:24:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ca528eab846e07e965bf6727b0af373c5aa49460
f9abf0802e5e0f7876cb21d7\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:46Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-q2sdr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-kvv2h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.784686 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-dntfk" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"144a13fc-5921-4106-8a80-210689777cd4\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:37Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://db52fd04351e4df6c5d315e881b9d2496d5e1144d38e678388233dcd9a258394\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-6pl4k\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:37Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-dntfk\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc kubenswrapper[4765]: I0319 10:24:44.799700 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ab39cf0a-a301-484b-9328-19acff8edae9\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-j85hd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:44Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-t8k4k\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:44 crc 
kubenswrapper[4765]: I0319 10:24:44.837351 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6ac87e92-7849-487e-9e5f-3ffe276949d3\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:22:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:21:42Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6a77839e36dffe66fd8a5ccdba128bcb88f38dddb112cb336cc4601e55ede39e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}
]},{\\\"containerID\\\":\\\"cri-o://431fac37e0847d1f0af0b5b6bbb8b590666428db9f823dae32febc2125daf85c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6790ba1b6bb625c70fd7235058fa9b7061308c74f684e513c6861233d6a05529\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://d63c930fb3e88ccb1eccdcb13a7d5b6f68265eac3f2b8f785621d6c6a9a98403\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779
036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://89c9aa903c850d331bf8c08c08c8587fbdcd01cad743425c61821953bf4ef4d6\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:21:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"sta
te\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cfc872038ffa1f10e10d27a7b6b73e905eb14b063a9a007039b6caa1b93a123c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://bdc8241a037fdeedcc2297442111d38172bc6614ec9f61365b9f7daadb23a552\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:44Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3fbd88867590ffa061323e6eeec09d4f8c9403698ed5867b5e01006bd6e7875d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:21:45Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:21:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:21:42Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:44Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.355490 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.355578 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.355670 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:45 crc kubenswrapper[4765]: E0319 10:24:45.355701 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:45 crc kubenswrapper[4765]: E0319 10:24:45.356047 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:45 crc kubenswrapper[4765]: E0319 10:24:45.356370 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.428843 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/3.log" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.434333 4765 scope.go:117] "RemoveContainer" containerID="fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9" Mar 19 10:24:45 crc kubenswrapper[4765]: E0319 10:24:45.434578 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.452404 4765 
status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:15Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.466759 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a7d72ad1-7f25-4580-b845-7f66e8f78bff\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a243a4d9adfb5b6af6bd87b68195ed4a111d9481fbaf5273adfeab0f6bff6dab\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://bac2fb12c527f8417dc08378065d3f56b5cc5505
2743230c4bb2f532ca6be00d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2fj62\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-4sj5l\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.483789 4765 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-79fbl" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6231872-d9e9-455e-92f1-51acc5985f6a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:31Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-19T10:23:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://51c76287f7290d9ab59ae4c63cd1879a7b6f4bbc1caa6d01a8b2e602df4d8dec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-19T10:23:53Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d7d42a257940c073ebd326d07c0c19bc6c2f7cf20f3024938a5a42f3674cff7a\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:47Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a84d94002fb075b7158a912263a7cd3ac6cbec601046e728fa48165d3eec6f52\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:48Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:48Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://654c29f057c044f2ef55b32b133a56b91614c6de8517b16664007d691042f144\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:49Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8352b
24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8352b24f126304c38ac9d2a7200be196315a19ccbe9d35bbcf8336122c6ab38c\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:50Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:50Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b2d6377463ad815489f84894ad3ca5acee1af1621d40d53e7dc4240a4644633b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:51Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:51Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c16a2b42beaa7b716cd91f901bda0a68866badbcf45f4050b2146e5efbe391fa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-19T10:23:52Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-19T10:23:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-ww7wr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-19T10:23:31Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-79fbl\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-19T10:24:45Z is after 2025-08-24T17:21:41Z" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.531013 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=26.530942481 podStartE2EDuration="26.530942481s" podCreationTimestamp="2026-03-19 10:24:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:24:45.511797323 +0000 UTC m=+183.860742865" watchObservedRunningTime="2026-03-19 10:24:45.530942481 +0000 UTC m=+183.879888063" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.531404 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=61.531394326 podStartE2EDuration="1m1.531394326s" podCreationTimestamp="2026-03-19 10:23:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:24:45.531276422 +0000 UTC m=+183.880221974" watchObservedRunningTime="2026-03-19 10:24:45.531394326 +0000 UTC m=+183.880339908" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.585496 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kv7q9" podStartSLOduration=112.585470849 podStartE2EDuration="1m52.585470849s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:24:45.568810263 +0000 UTC m=+183.917755815" 
watchObservedRunningTime="2026-03-19 10:24:45.585470849 +0000 UTC m=+183.934416401" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.608078 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=16.608040761 podStartE2EDuration="16.608040761s" podCreationTimestamp="2026-03-19 10:24:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:24:45.585975986 +0000 UTC m=+183.934921548" watchObservedRunningTime="2026-03-19 10:24:45.608040761 +0000 UTC m=+183.956986333" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.669604 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-mmrh7" podStartSLOduration=113.669575672 podStartE2EDuration="1m53.669575672s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:24:45.669492839 +0000 UTC m=+184.018438391" watchObservedRunningTime="2026-03-19 10:24:45.669575672 +0000 UTC m=+184.018521214" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.707919 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-dntfk" podStartSLOduration=113.707897139 podStartE2EDuration="1m53.707897139s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:24:45.707798536 +0000 UTC m=+184.056744088" watchObservedRunningTime="2026-03-19 10:24:45.707897139 +0000 UTC m=+184.056842691" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.745466 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" 
podStartSLOduration=78.745438771 podStartE2EDuration="1m18.745438771s" podCreationTimestamp="2026-03-19 10:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:24:45.745329057 +0000 UTC m=+184.094274619" watchObservedRunningTime="2026-03-19 10:24:45.745438771 +0000 UTC m=+184.094384313" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.764318 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=78.764266708 podStartE2EDuration="1m18.764266708s" podCreationTimestamp="2026-03-19 10:23:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:24:45.763622447 +0000 UTC m=+184.112568009" watchObservedRunningTime="2026-03-19 10:24:45.764266708 +0000 UTC m=+184.113212250" Mar 19 10:24:45 crc kubenswrapper[4765]: I0319 10:24:45.793724 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wcdqx" podStartSLOduration=113.79369771 podStartE2EDuration="1m53.79369771s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:24:45.793134651 +0000 UTC m=+184.142080193" watchObservedRunningTime="2026-03-19 10:24:45.79369771 +0000 UTC m=+184.142643252" Mar 19 10:24:46 crc kubenswrapper[4765]: I0319 10:24:46.355823 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:46 crc kubenswrapper[4765]: E0319 10:24:46.356260 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:47 crc kubenswrapper[4765]: I0319 10:24:47.355499 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:47 crc kubenswrapper[4765]: I0319 10:24:47.355511 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:47 crc kubenswrapper[4765]: E0319 10:24:47.355930 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:47 crc kubenswrapper[4765]: E0319 10:24:47.356104 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:47 crc kubenswrapper[4765]: I0319 10:24:47.356287 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:47 crc kubenswrapper[4765]: E0319 10:24:47.356576 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:47 crc kubenswrapper[4765]: E0319 10:24:47.492899 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 10:24:48 crc kubenswrapper[4765]: I0319 10:24:48.118292 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs\") pod \"network-metrics-daemon-t8k4k\" (UID: \"ab39cf0a-a301-484b-9328-19acff8edae9\") " pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:48 crc kubenswrapper[4765]: E0319 10:24:48.118562 4765 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:24:48 crc kubenswrapper[4765]: E0319 10:24:48.118909 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs podName:ab39cf0a-a301-484b-9328-19acff8edae9 nodeName:}" failed. 
No retries permitted until 2026-03-19 10:25:52.118881024 +0000 UTC m=+250.467826606 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs") pod "network-metrics-daemon-t8k4k" (UID: "ab39cf0a-a301-484b-9328-19acff8edae9") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 19 10:24:48 crc kubenswrapper[4765]: I0319 10:24:48.356483 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:48 crc kubenswrapper[4765]: E0319 10:24:48.357101 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:49 crc kubenswrapper[4765]: I0319 10:24:49.356024 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:49 crc kubenswrapper[4765]: I0319 10:24:49.356024 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:49 crc kubenswrapper[4765]: I0319 10:24:49.356179 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:49 crc kubenswrapper[4765]: E0319 10:24:49.356517 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:49 crc kubenswrapper[4765]: E0319 10:24:49.356655 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:49 crc kubenswrapper[4765]: E0319 10:24:49.356869 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:50 crc kubenswrapper[4765]: I0319 10:24:50.355540 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:50 crc kubenswrapper[4765]: E0319 10:24:50.355827 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:51 crc kubenswrapper[4765]: I0319 10:24:51.355653 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:51 crc kubenswrapper[4765]: I0319 10:24:51.355766 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:51 crc kubenswrapper[4765]: I0319 10:24:51.355781 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:51 crc kubenswrapper[4765]: E0319 10:24:51.355921 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:51 crc kubenswrapper[4765]: E0319 10:24:51.356182 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:51 crc kubenswrapper[4765]: E0319 10:24:51.356328 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.356062 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:52 crc kubenswrapper[4765]: E0319 10:24:52.358104 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.404318 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podStartSLOduration=120.404294767 podStartE2EDuration="2m0.404294767s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:24:52.404041929 +0000 UTC m=+190.752987491" watchObservedRunningTime="2026-03-19 10:24:52.404294767 +0000 UTC m=+190.753240309" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.418316 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.418362 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.418376 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.418395 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.418408 4765 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-19T10:24:52Z","lastTransitionTime":"2026-03-19T10:24:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.438822 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-79fbl" podStartSLOduration=120.438750175 podStartE2EDuration="2m0.438750175s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:24:52.432806847 +0000 UTC m=+190.781752389" watchObservedRunningTime="2026-03-19 10:24:52.438750175 +0000 UTC m=+190.787695757" Mar 19 10:24:52 crc kubenswrapper[4765]: E0319 10:24:52.494245 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.495675 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj"] Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.496664 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.499948 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.500379 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.500863 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.504714 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.516983 4765 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.532587 4765 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.574004 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/815d7810-65ef-49d0-a4d3-2b0410e666d8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lm6qj\" (UID: \"815d7810-65ef-49d0-a4d3-2b0410e666d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.574370 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/815d7810-65ef-49d0-a4d3-2b0410e666d8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lm6qj\" (UID: 
\"815d7810-65ef-49d0-a4d3-2b0410e666d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.574467 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/815d7810-65ef-49d0-a4d3-2b0410e666d8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lm6qj\" (UID: \"815d7810-65ef-49d0-a4d3-2b0410e666d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.574562 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/815d7810-65ef-49d0-a4d3-2b0410e666d8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lm6qj\" (UID: \"815d7810-65ef-49d0-a4d3-2b0410e666d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.574648 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/815d7810-65ef-49d0-a4d3-2b0410e666d8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lm6qj\" (UID: \"815d7810-65ef-49d0-a4d3-2b0410e666d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.675408 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/815d7810-65ef-49d0-a4d3-2b0410e666d8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lm6qj\" (UID: \"815d7810-65ef-49d0-a4d3-2b0410e666d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.675479 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/815d7810-65ef-49d0-a4d3-2b0410e666d8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lm6qj\" (UID: \"815d7810-65ef-49d0-a4d3-2b0410e666d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.675522 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/815d7810-65ef-49d0-a4d3-2b0410e666d8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lm6qj\" (UID: \"815d7810-65ef-49d0-a4d3-2b0410e666d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.675602 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/815d7810-65ef-49d0-a4d3-2b0410e666d8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lm6qj\" (UID: \"815d7810-65ef-49d0-a4d3-2b0410e666d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.675641 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/815d7810-65ef-49d0-a4d3-2b0410e666d8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lm6qj\" (UID: \"815d7810-65ef-49d0-a4d3-2b0410e666d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.675917 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/815d7810-65ef-49d0-a4d3-2b0410e666d8-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-lm6qj\" (UID: \"815d7810-65ef-49d0-a4d3-2b0410e666d8\") 
" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.676014 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/815d7810-65ef-49d0-a4d3-2b0410e666d8-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-lm6qj\" (UID: \"815d7810-65ef-49d0-a4d3-2b0410e666d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.677820 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/815d7810-65ef-49d0-a4d3-2b0410e666d8-service-ca\") pod \"cluster-version-operator-5c965bbfc6-lm6qj\" (UID: \"815d7810-65ef-49d0-a4d3-2b0410e666d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.687041 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/815d7810-65ef-49d0-a4d3-2b0410e666d8-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-lm6qj\" (UID: \"815d7810-65ef-49d0-a4d3-2b0410e666d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.696791 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/815d7810-65ef-49d0-a4d3-2b0410e666d8-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-lm6qj\" (UID: \"815d7810-65ef-49d0-a4d3-2b0410e666d8\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: I0319 10:24:52.817925 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" Mar 19 10:24:52 crc kubenswrapper[4765]: W0319 10:24:52.836611 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod815d7810_65ef_49d0_a4d3_2b0410e666d8.slice/crio-72991430f5466e659e415cf2822246501aa8c1d373c8d6d438bfcde11c6edd42 WatchSource:0}: Error finding container 72991430f5466e659e415cf2822246501aa8c1d373c8d6d438bfcde11c6edd42: Status 404 returned error can't find the container with id 72991430f5466e659e415cf2822246501aa8c1d373c8d6d438bfcde11c6edd42 Mar 19 10:24:53 crc kubenswrapper[4765]: I0319 10:24:53.355445 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:53 crc kubenswrapper[4765]: E0319 10:24:53.355653 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:53 crc kubenswrapper[4765]: I0319 10:24:53.355687 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:53 crc kubenswrapper[4765]: E0319 10:24:53.355769 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:53 crc kubenswrapper[4765]: I0319 10:24:53.355868 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:53 crc kubenswrapper[4765]: E0319 10:24:53.356162 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:53 crc kubenswrapper[4765]: I0319 10:24:53.471608 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" event={"ID":"815d7810-65ef-49d0-a4d3-2b0410e666d8","Type":"ContainerStarted","Data":"7db8a7b16f02256d3876f20dd7b63155ff8dd281bcb493b7dff124b5a7ae3f6e"} Mar 19 10:24:53 crc kubenswrapper[4765]: I0319 10:24:53.471680 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" event={"ID":"815d7810-65ef-49d0-a4d3-2b0410e666d8","Type":"ContainerStarted","Data":"72991430f5466e659e415cf2822246501aa8c1d373c8d6d438bfcde11c6edd42"} Mar 19 10:24:53 crc kubenswrapper[4765]: I0319 10:24:53.491114 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-lm6qj" podStartSLOduration=121.491082003 podStartE2EDuration="2m1.491082003s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:24:53.490504493 +0000 UTC m=+191.839450035" 
watchObservedRunningTime="2026-03-19 10:24:53.491082003 +0000 UTC m=+191.840027585" Mar 19 10:24:54 crc kubenswrapper[4765]: I0319 10:24:54.356274 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:54 crc kubenswrapper[4765]: E0319 10:24:54.356477 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:55 crc kubenswrapper[4765]: I0319 10:24:55.355888 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:55 crc kubenswrapper[4765]: I0319 10:24:55.355916 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:55 crc kubenswrapper[4765]: E0319 10:24:55.356098 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:55 crc kubenswrapper[4765]: E0319 10:24:55.356280 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:55 crc kubenswrapper[4765]: I0319 10:24:55.355911 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:55 crc kubenswrapper[4765]: E0319 10:24:55.357019 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:56 crc kubenswrapper[4765]: I0319 10:24:56.355805 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:56 crc kubenswrapper[4765]: E0319 10:24:56.356040 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:57 crc kubenswrapper[4765]: I0319 10:24:57.356309 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:57 crc kubenswrapper[4765]: I0319 10:24:57.356419 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:57 crc kubenswrapper[4765]: I0319 10:24:57.356310 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:57 crc kubenswrapper[4765]: E0319 10:24:57.356625 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:57 crc kubenswrapper[4765]: E0319 10:24:57.356839 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:57 crc kubenswrapper[4765]: E0319 10:24:57.357079 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:57 crc kubenswrapper[4765]: E0319 10:24:57.495942 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 10:24:58 crc kubenswrapper[4765]: I0319 10:24:58.356120 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:24:58 crc kubenswrapper[4765]: E0319 10:24:58.356353 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:24:59 crc kubenswrapper[4765]: I0319 10:24:59.355277 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:24:59 crc kubenswrapper[4765]: I0319 10:24:59.355277 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:24:59 crc kubenswrapper[4765]: I0319 10:24:59.355308 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:24:59 crc kubenswrapper[4765]: E0319 10:24:59.355636 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:24:59 crc kubenswrapper[4765]: E0319 10:24:59.355864 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:24:59 crc kubenswrapper[4765]: E0319 10:24:59.356679 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:24:59 crc kubenswrapper[4765]: I0319 10:24:59.357296 4765 scope.go:117] "RemoveContainer" containerID="fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9" Mar 19 10:24:59 crc kubenswrapper[4765]: E0319 10:24:59.357571 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" Mar 19 10:25:00 crc kubenswrapper[4765]: I0319 10:25:00.355305 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:00 crc kubenswrapper[4765]: E0319 10:25:00.355567 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:01 crc kubenswrapper[4765]: I0319 10:25:01.355355 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:01 crc kubenswrapper[4765]: I0319 10:25:01.355432 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:01 crc kubenswrapper[4765]: E0319 10:25:01.355623 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:01 crc kubenswrapper[4765]: I0319 10:25:01.355468 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:01 crc kubenswrapper[4765]: E0319 10:25:01.355858 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:01 crc kubenswrapper[4765]: E0319 10:25:01.356460 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:02 crc kubenswrapper[4765]: I0319 10:25:02.355927 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:02 crc kubenswrapper[4765]: E0319 10:25:02.358197 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:02 crc kubenswrapper[4765]: E0319 10:25:02.497535 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 10:25:03 crc kubenswrapper[4765]: I0319 10:25:03.355745 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:03 crc kubenswrapper[4765]: I0319 10:25:03.355858 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:03 crc kubenswrapper[4765]: I0319 10:25:03.355820 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:03 crc kubenswrapper[4765]: E0319 10:25:03.356327 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:03 crc kubenswrapper[4765]: E0319 10:25:03.356433 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:03 crc kubenswrapper[4765]: E0319 10:25:03.356572 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:04 crc kubenswrapper[4765]: I0319 10:25:04.356098 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:04 crc kubenswrapper[4765]: E0319 10:25:04.356435 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:05 crc kubenswrapper[4765]: I0319 10:25:05.355670 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:05 crc kubenswrapper[4765]: I0319 10:25:05.355706 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:05 crc kubenswrapper[4765]: E0319 10:25:05.355904 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:05 crc kubenswrapper[4765]: I0319 10:25:05.356044 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:05 crc kubenswrapper[4765]: E0319 10:25:05.356259 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:05 crc kubenswrapper[4765]: E0319 10:25:05.356350 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:06 crc kubenswrapper[4765]: I0319 10:25:06.356083 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:06 crc kubenswrapper[4765]: E0319 10:25:06.356279 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:07 crc kubenswrapper[4765]: I0319 10:25:07.355215 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:07 crc kubenswrapper[4765]: I0319 10:25:07.355230 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:07 crc kubenswrapper[4765]: I0319 10:25:07.355227 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:07 crc kubenswrapper[4765]: E0319 10:25:07.355560 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:07 crc kubenswrapper[4765]: E0319 10:25:07.355710 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:07 crc kubenswrapper[4765]: E0319 10:25:07.355849 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:07 crc kubenswrapper[4765]: E0319 10:25:07.498835 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 10:25:08 crc kubenswrapper[4765]: I0319 10:25:08.355173 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:08 crc kubenswrapper[4765]: E0319 10:25:08.355471 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:09 crc kubenswrapper[4765]: I0319 10:25:09.355580 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:09 crc kubenswrapper[4765]: I0319 10:25:09.355653 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:09 crc kubenswrapper[4765]: I0319 10:25:09.355616 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:09 crc kubenswrapper[4765]: E0319 10:25:09.355818 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:09 crc kubenswrapper[4765]: E0319 10:25:09.356023 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:09 crc kubenswrapper[4765]: E0319 10:25:09.356115 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:10 crc kubenswrapper[4765]: I0319 10:25:10.356048 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:10 crc kubenswrapper[4765]: E0319 10:25:10.356317 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:11 crc kubenswrapper[4765]: I0319 10:25:11.355901 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:11 crc kubenswrapper[4765]: E0319 10:25:11.356118 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:11 crc kubenswrapper[4765]: I0319 10:25:11.356228 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:11 crc kubenswrapper[4765]: I0319 10:25:11.356275 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:11 crc kubenswrapper[4765]: E0319 10:25:11.356394 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:11 crc kubenswrapper[4765]: E0319 10:25:11.356533 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:12 crc kubenswrapper[4765]: I0319 10:25:12.355327 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:12 crc kubenswrapper[4765]: E0319 10:25:12.355558 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:12 crc kubenswrapper[4765]: E0319 10:25:12.499570 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 10:25:13 crc kubenswrapper[4765]: I0319 10:25:13.356255 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:13 crc kubenswrapper[4765]: I0319 10:25:13.356273 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:13 crc kubenswrapper[4765]: I0319 10:25:13.356927 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:13 crc kubenswrapper[4765]: E0319 10:25:13.357121 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:13 crc kubenswrapper[4765]: E0319 10:25:13.357259 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:13 crc kubenswrapper[4765]: E0319 10:25:13.357505 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:14 crc kubenswrapper[4765]: I0319 10:25:14.355232 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:14 crc kubenswrapper[4765]: E0319 10:25:14.355391 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:14 crc kubenswrapper[4765]: I0319 10:25:14.356723 4765 scope.go:117] "RemoveContainer" containerID="fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9" Mar 19 10:25:14 crc kubenswrapper[4765]: E0319 10:25:14.356912 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-kvv2h_openshift-ovn-kubernetes(71cc276b-f25c-460b-b718-f058cc1d2521)\"" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" Mar 19 10:25:15 crc kubenswrapper[4765]: I0319 10:25:15.355707 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:15 crc kubenswrapper[4765]: I0319 10:25:15.355805 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:15 crc kubenswrapper[4765]: I0319 10:25:15.355862 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:15 crc kubenswrapper[4765]: E0319 10:25:15.355897 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:15 crc kubenswrapper[4765]: E0319 10:25:15.356110 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:15 crc kubenswrapper[4765]: E0319 10:25:15.356192 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:16 crc kubenswrapper[4765]: I0319 10:25:16.355877 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:16 crc kubenswrapper[4765]: E0319 10:25:16.356096 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:17 crc kubenswrapper[4765]: I0319 10:25:17.355934 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:17 crc kubenswrapper[4765]: I0319 10:25:17.355934 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:17 crc kubenswrapper[4765]: I0319 10:25:17.355951 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:17 crc kubenswrapper[4765]: E0319 10:25:17.356098 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:17 crc kubenswrapper[4765]: E0319 10:25:17.356165 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:17 crc kubenswrapper[4765]: E0319 10:25:17.356229 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:17 crc kubenswrapper[4765]: E0319 10:25:17.501405 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 10:25:18 crc kubenswrapper[4765]: I0319 10:25:18.356196 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:18 crc kubenswrapper[4765]: E0319 10:25:18.356459 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:19 crc kubenswrapper[4765]: I0319 10:25:19.355832 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:19 crc kubenswrapper[4765]: I0319 10:25:19.356015 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:19 crc kubenswrapper[4765]: E0319 10:25:19.356135 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:19 crc kubenswrapper[4765]: I0319 10:25:19.356221 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:19 crc kubenswrapper[4765]: E0319 10:25:19.356492 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:19 crc kubenswrapper[4765]: E0319 10:25:19.356594 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:20 crc kubenswrapper[4765]: I0319 10:25:20.356390 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:20 crc kubenswrapper[4765]: E0319 10:25:20.356611 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:20 crc kubenswrapper[4765]: I0319 10:25:20.571072 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mmrh7_d9d027fd-4e70-4daf-9dd2-adefcc2a868f/kube-multus/1.log" Mar 19 10:25:20 crc kubenswrapper[4765]: I0319 10:25:20.571904 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mmrh7_d9d027fd-4e70-4daf-9dd2-adefcc2a868f/kube-multus/0.log" Mar 19 10:25:20 crc kubenswrapper[4765]: I0319 10:25:20.571991 4765 generic.go:334] "Generic (PLEG): container finished" podID="d9d027fd-4e70-4daf-9dd2-adefcc2a868f" containerID="c7758269c24290d83fecce42e3d1f3e9569ca7c19b2c66970a979d98455cb6f2" exitCode=1 Mar 19 10:25:20 crc kubenswrapper[4765]: I0319 10:25:20.572037 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mmrh7" event={"ID":"d9d027fd-4e70-4daf-9dd2-adefcc2a868f","Type":"ContainerDied","Data":"c7758269c24290d83fecce42e3d1f3e9569ca7c19b2c66970a979d98455cb6f2"} Mar 19 10:25:20 crc kubenswrapper[4765]: I0319 10:25:20.572082 4765 scope.go:117] "RemoveContainer" containerID="10344ab3f77b5519aa3d3f2f6265920da8b5f154b939d7e4b8f607a199569cb0" Mar 19 10:25:20 crc kubenswrapper[4765]: I0319 10:25:20.572730 4765 scope.go:117] "RemoveContainer" containerID="c7758269c24290d83fecce42e3d1f3e9569ca7c19b2c66970a979d98455cb6f2" Mar 19 10:25:20 crc kubenswrapper[4765]: E0319 10:25:20.573034 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-mmrh7_openshift-multus(d9d027fd-4e70-4daf-9dd2-adefcc2a868f)\"" pod="openshift-multus/multus-mmrh7" podUID="d9d027fd-4e70-4daf-9dd2-adefcc2a868f" Mar 19 10:25:21 crc kubenswrapper[4765]: I0319 10:25:21.356257 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:21 crc kubenswrapper[4765]: I0319 10:25:21.356407 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:21 crc kubenswrapper[4765]: I0319 10:25:21.356385 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:21 crc kubenswrapper[4765]: E0319 10:25:21.356587 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:21 crc kubenswrapper[4765]: E0319 10:25:21.356776 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:21 crc kubenswrapper[4765]: E0319 10:25:21.356902 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:21 crc kubenswrapper[4765]: I0319 10:25:21.579043 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mmrh7_d9d027fd-4e70-4daf-9dd2-adefcc2a868f/kube-multus/1.log" Mar 19 10:25:22 crc kubenswrapper[4765]: I0319 10:25:22.356022 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:22 crc kubenswrapper[4765]: E0319 10:25:22.359042 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:22 crc kubenswrapper[4765]: E0319 10:25:22.502378 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 19 10:25:23 crc kubenswrapper[4765]: I0319 10:25:23.244785 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 10:25:23.245072 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:27:25.245038999 +0000 UTC m=+343.593984551 (durationBeforeRetry 2m2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:23 crc kubenswrapper[4765]: I0319 10:25:23.245251 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 10:25:23.245511 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 10:25:23.245675 4765 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:27:25.245637615 +0000 UTC m=+343.594583197 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 19 10:25:23 crc kubenswrapper[4765]: I0319 10:25:23.346145 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:23 crc kubenswrapper[4765]: I0319 10:25:23.346252 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:23 crc kubenswrapper[4765]: I0319 10:25:23.346299 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 
10:25:23.346518 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 10:25:23.346585 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 10:25:23.346526 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 10:25:23.346726 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:27:25.346689819 +0000 UTC m=+343.695635401 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 10:25:23.346523 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 10:25:23.346790 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 10:25:23.346819 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 10:25:23.346607 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 10:25:23.346907 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 10:27:25.346879004 +0000 UTC m=+343.695824586 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 10:25:23.347015 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 10:27:25.346948916 +0000 UTC m=+343.695894498 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 19 10:25:23 crc kubenswrapper[4765]: I0319 10:25:23.355920 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:23 crc kubenswrapper[4765]: I0319 10:25:23.356041 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:23 crc kubenswrapper[4765]: I0319 10:25:23.355926 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 10:25:23.356214 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 10:25:23.356370 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:23 crc kubenswrapper[4765]: E0319 10:25:23.356572 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:24 crc kubenswrapper[4765]: I0319 10:25:24.355599 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:24 crc kubenswrapper[4765]: E0319 10:25:24.355780 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:25 crc kubenswrapper[4765]: I0319 10:25:25.355934 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:25 crc kubenswrapper[4765]: I0319 10:25:25.356120 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:25 crc kubenswrapper[4765]: E0319 10:25:25.356192 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:25 crc kubenswrapper[4765]: E0319 10:25:25.356338 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:25 crc kubenswrapper[4765]: I0319 10:25:25.356133 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:25 crc kubenswrapper[4765]: E0319 10:25:25.356627 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:26 crc kubenswrapper[4765]: I0319 10:25:26.356071 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:26 crc kubenswrapper[4765]: E0319 10:25:26.356324 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:27 crc kubenswrapper[4765]: I0319 10:25:27.355509 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:27 crc kubenswrapper[4765]: I0319 10:25:27.355510 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:27 crc kubenswrapper[4765]: E0319 10:25:27.356066 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:27 crc kubenswrapper[4765]: I0319 10:25:27.355585 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:27 crc kubenswrapper[4765]: E0319 10:25:27.356265 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:27 crc kubenswrapper[4765]: E0319 10:25:27.356419 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:27 crc kubenswrapper[4765]: E0319 10:25:27.504154 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 19 10:25:28 crc kubenswrapper[4765]: I0319 10:25:28.362301 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:28 crc kubenswrapper[4765]: E0319 10:25:28.362547 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:28 crc kubenswrapper[4765]: I0319 10:25:28.363660 4765 scope.go:117] "RemoveContainer" containerID="fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9" Mar 19 10:25:28 crc kubenswrapper[4765]: I0319 10:25:28.611840 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/3.log" Mar 19 10:25:28 crc kubenswrapper[4765]: I0319 10:25:28.615433 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerStarted","Data":"ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a"} Mar 19 10:25:28 crc kubenswrapper[4765]: I0319 10:25:28.615946 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:25:28 crc kubenswrapper[4765]: I0319 10:25:28.647825 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podStartSLOduration=156.647802252 podStartE2EDuration="2m36.647802252s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:28.64442294 +0000 UTC m=+226.993368492" watchObservedRunningTime="2026-03-19 10:25:28.647802252 +0000 UTC m=+226.996747794" Mar 19 10:25:29 crc kubenswrapper[4765]: I0319 10:25:29.286534 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-t8k4k"] Mar 19 10:25:29 crc kubenswrapper[4765]: I0319 10:25:29.286713 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:29 crc kubenswrapper[4765]: E0319 10:25:29.286836 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:29 crc kubenswrapper[4765]: I0319 10:25:29.355771 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:29 crc kubenswrapper[4765]: I0319 10:25:29.355872 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:29 crc kubenswrapper[4765]: E0319 10:25:29.356063 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:29 crc kubenswrapper[4765]: E0319 10:25:29.356196 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:30 crc kubenswrapper[4765]: I0319 10:25:30.355990 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:30 crc kubenswrapper[4765]: E0319 10:25:30.356196 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:31 crc kubenswrapper[4765]: I0319 10:25:31.356042 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:31 crc kubenswrapper[4765]: I0319 10:25:31.356072 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:31 crc kubenswrapper[4765]: E0319 10:25:31.356799 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:31 crc kubenswrapper[4765]: I0319 10:25:31.356102 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:31 crc kubenswrapper[4765]: E0319 10:25:31.356895 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:31 crc kubenswrapper[4765]: E0319 10:25:31.357037 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:32 crc kubenswrapper[4765]: I0319 10:25:32.355329 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:32 crc kubenswrapper[4765]: E0319 10:25:32.356424 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:32 crc kubenswrapper[4765]: E0319 10:25:32.505371 4765 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 19 10:25:33 crc kubenswrapper[4765]: I0319 10:25:33.356196 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:33 crc kubenswrapper[4765]: I0319 10:25:33.356244 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:33 crc kubenswrapper[4765]: I0319 10:25:33.356315 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:33 crc kubenswrapper[4765]: E0319 10:25:33.357121 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:33 crc kubenswrapper[4765]: E0319 10:25:33.357267 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:33 crc kubenswrapper[4765]: E0319 10:25:33.357160 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:34 crc kubenswrapper[4765]: I0319 10:25:34.356005 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:34 crc kubenswrapper[4765]: E0319 10:25:34.356449 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:34 crc kubenswrapper[4765]: I0319 10:25:34.356997 4765 scope.go:117] "RemoveContainer" containerID="c7758269c24290d83fecce42e3d1f3e9569ca7c19b2c66970a979d98455cb6f2" Mar 19 10:25:34 crc kubenswrapper[4765]: I0319 10:25:34.640735 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mmrh7_d9d027fd-4e70-4daf-9dd2-adefcc2a868f/kube-multus/1.log" Mar 19 10:25:34 crc kubenswrapper[4765]: I0319 10:25:34.640818 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mmrh7" event={"ID":"d9d027fd-4e70-4daf-9dd2-adefcc2a868f","Type":"ContainerStarted","Data":"e5956a936881882c7602c0fc752793d8c139ba7cbb4a3a5cd015552febec3d5b"} Mar 19 10:25:35 crc kubenswrapper[4765]: I0319 10:25:35.355572 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:35 crc kubenswrapper[4765]: I0319 10:25:35.355684 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:35 crc kubenswrapper[4765]: E0319 10:25:35.355796 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:35 crc kubenswrapper[4765]: I0319 10:25:35.355700 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:35 crc kubenswrapper[4765]: E0319 10:25:35.355886 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:35 crc kubenswrapper[4765]: E0319 10:25:35.356182 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:36 crc kubenswrapper[4765]: I0319 10:25:36.356310 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:36 crc kubenswrapper[4765]: E0319 10:25:36.356587 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:25:37 crc kubenswrapper[4765]: I0319 10:25:37.355881 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:37 crc kubenswrapper[4765]: I0319 10:25:37.355936 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:37 crc kubenswrapper[4765]: I0319 10:25:37.355936 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:37 crc kubenswrapper[4765]: E0319 10:25:37.356049 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:25:37 crc kubenswrapper[4765]: E0319 10:25:37.356231 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-t8k4k" podUID="ab39cf0a-a301-484b-9328-19acff8edae9" Mar 19 10:25:37 crc kubenswrapper[4765]: E0319 10:25:37.356296 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:25:38 crc kubenswrapper[4765]: I0319 10:25:38.355774 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:25:38 crc kubenswrapper[4765]: I0319 10:25:38.359286 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 10:25:38 crc kubenswrapper[4765]: I0319 10:25:38.361538 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 10:25:39 crc kubenswrapper[4765]: I0319 10:25:39.356264 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:39 crc kubenswrapper[4765]: I0319 10:25:39.356306 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:25:39 crc kubenswrapper[4765]: I0319 10:25:39.356332 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:25:39 crc kubenswrapper[4765]: I0319 10:25:39.359441 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 10:25:39 crc kubenswrapper[4765]: I0319 10:25:39.360481 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 19 10:25:39 crc kubenswrapper[4765]: I0319 10:25:39.360875 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 10:25:39 crc kubenswrapper[4765]: I0319 10:25:39.361259 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.596401 4765 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.634855 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7hqvg"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.635458 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.638414 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.638930 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.645570 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-z85bn"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.646098 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.648158 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.648336 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.648478 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.648577 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.648675 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.648820 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.648945 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.649121 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 
10:25:43.649481 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.649543 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.649757 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.649892 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.649919 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.650028 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.656398 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.656530 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.656524 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.656397 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.656538 4765 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.656562 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.656731 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.662586 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.672673 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.673883 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.676019 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.682349 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r9spg"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.700324 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.704368 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xdm6r"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.705642 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.706088 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.706433 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.706498 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.706645 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.706795 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.707578 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-s58pr"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.708055 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.708246 4765 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.708601 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.708751 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.708899 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.710438 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.710476 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.710686 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xdm6r" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.710856 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.711174 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.711326 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.711507 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.711638 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.711795 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.712295 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.712427 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.712743 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.714770 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.715316 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8sk6m"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.715722 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.716518 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.716735 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.720145 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.722462 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-56zlb"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.723255 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-94dnk"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.723713 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.724183 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.725398 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.726157 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.726280 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.726376 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.726495 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.726608 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.726729 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.728360 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.728804 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qfvdv"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.729536 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.729890 
4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.730115 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.730442 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.731174 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.731341 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.731366 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.731924 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.732234 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.732366 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.732501 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.732675 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 10:25:43 crc 
kubenswrapper[4765]: I0319 10:25:43.732835 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.733224 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.733472 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.733749 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.733781 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.734292 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.735308 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.735574 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.735810 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.735862 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 19 
10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.736024 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.736862 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.737070 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.737525 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.737754 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.737935 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.738310 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.757115 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.757215 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.757443 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.758259 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qfvdv" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.758645 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.765719 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-jnmk9"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.766942 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-jnmk9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.771669 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.774546 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x94pq"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.775201 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.774632 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.779328 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.779827 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.781751 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.782111 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.782335 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.782587 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.782648 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.804305 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.804764 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.804854 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.804911 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.805019 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.805074 4765 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-console"/"console-serving-cert" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.805127 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.805182 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.806719 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-client-ca\") pod \"controller-manager-879f6c89f-r9spg\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.806768 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93cd26ea-e56f-4bb3-9bae-1e9b552480d8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-c96ts\" (UID: \"93cd26ea-e56f-4bb3-9bae-1e9b552480d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.806793 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7828p\" (UniqueName: \"kubernetes.io/projected/773bc628-94fe-43c5-8247-48c8d510df6a-kube-api-access-7828p\") pod \"openshift-config-operator-7777fb866f-2gjzh\" (UID: \"773bc628-94fe-43c5-8247-48c8d510df6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.806811 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/df32ebd2-1bfc-4da0-959a-abc479034b0b-serving-cert\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.806827 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93cd26ea-e56f-4bb3-9bae-1e9b552480d8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-c96ts\" (UID: \"93cd26ea-e56f-4bb3-9bae-1e9b552480d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.806844 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-trusted-ca-bundle\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.806857 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3bde734c-df56-471b-8a70-2f555a974e57-image-import-ca\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.806872 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/396daff4-daaf-43de-8794-5076381c0d47-audit-dir\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 
10:25:43.809909 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.810015 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.810259 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.810931 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.811729 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.812617 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72f5edb0-c000-4e80-b27d-d0d6023510f8-serving-cert\") pod \"controller-manager-879f6c89f-r9spg\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.812684 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3bde734c-df56-471b-8a70-2f555a974e57-audit-dir\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.812719 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-serving-cert\") pod \"route-controller-manager-6576b87f9c-ptr8w\" (UID: 
\"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.812748 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-57drz\" (UniqueName: \"kubernetes.io/projected/ea55ad1c-3f3c-418d-aed8-915b494eb6fa-kube-api-access-57drz\") pod \"console-operator-58897d9998-xdm6r\" (UID: \"ea55ad1c-3f3c-418d-aed8-915b494eb6fa\") " pod="openshift-console-operator/console-operator-58897d9998-xdm6r" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813381 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-client-ca\") pod \"route-controller-manager-6576b87f9c-ptr8w\" (UID: \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813417 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/396daff4-daaf-43de-8794-5076381c0d47-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813473 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bde734c-df56-471b-8a70-2f555a974e57-config\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813500 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/993cdcd1-8323-49aa-b587-5a8c344a2077-images\") pod \"machine-api-operator-5694c8668f-7hqvg\" (UID: \"993cdcd1-8323-49aa-b587-5a8c344a2077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813558 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-r9spg\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813587 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/df32ebd2-1bfc-4da0-959a-abc479034b0b-etcd-service-ca\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813613 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/396daff4-daaf-43de-8794-5076381c0d47-etcd-client\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813638 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a55008bb-1e97-4a50-9fa7-6a43c7edbc29-auth-proxy-config\") pod \"machine-approver-56656f9798-vc8ns\" (UID: \"a55008bb-1e97-4a50-9fa7-6a43c7edbc29\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813662 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flkks\" (UniqueName: \"kubernetes.io/projected/3bde734c-df56-471b-8a70-2f555a974e57-kube-api-access-flkks\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813717 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bde734c-df56-471b-8a70-2f555a974e57-trusted-ca-bundle\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813773 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/993cdcd1-8323-49aa-b587-5a8c344a2077-config\") pod \"machine-api-operator-5694c8668f-7hqvg\" (UID: \"993cdcd1-8323-49aa-b587-5a8c344a2077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813798 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813823 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4s7q\" 
(UniqueName: \"kubernetes.io/projected/b0fd12d6-f32c-4f69-a285-8f837e745910-kube-api-access-m4s7q\") pod \"authentication-operator-69f744f599-z85bn\" (UID: \"b0fd12d6-f32c-4f69-a285-8f837e745910\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813844 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-config\") pod \"controller-manager-879f6c89f-r9spg\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813859 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813870 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813897 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3bde734c-df56-471b-8a70-2f555a974e57-etcd-client\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813923 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: 
\"kubernetes.io/empty-dir/773bc628-94fe-43c5-8247-48c8d510df6a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2gjzh\" (UID: \"773bc628-94fe-43c5-8247-48c8d510df6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813945 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3bde734c-df56-471b-8a70-2f555a974e57-encryption-config\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.813984 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/396daff4-daaf-43de-8794-5076381c0d47-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.814009 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd7kn\" (UniqueName: \"kubernetes.io/projected/396daff4-daaf-43de-8794-5076381c0d47-kube-api-access-fd7kn\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.814032 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" 
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.814064 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39658af6-59cf-48c7-9015-2271021bd64e-console-oauth-config\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.814085 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-oauth-serving-cert\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.814111 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/993cdcd1-8323-49aa-b587-5a8c344a2077-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7hqvg\" (UID: \"993cdcd1-8323-49aa-b587-5a8c344a2077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.814135 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.814188 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-config\") pod \"route-controller-manager-6576b87f9c-ptr8w\" (UID: \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.814217 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-service-ca\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.814245 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a55008bb-1e97-4a50-9fa7-6a43c7edbc29-config\") pod \"machine-approver-56656f9798-vc8ns\" (UID: \"a55008bb-1e97-4a50-9fa7-6a43c7edbc29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.814265 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/396daff4-daaf-43de-8794-5076381c0d47-audit-policies\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.814294 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.815654 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.818402 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea55ad1c-3f3c-418d-aed8-915b494eb6fa-trusted-ca\") pod \"console-operator-58897d9998-xdm6r\" (UID: \"ea55ad1c-3f3c-418d-aed8-915b494eb6fa\") " pod="openshift-console-operator/console-operator-58897d9998-xdm6r"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.818490 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.818546 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39658af6-59cf-48c7-9015-2271021bd64e-console-serving-cert\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.818574 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bde734c-df56-471b-8a70-2f555a974e57-serving-cert\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.818592 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a55008bb-1e97-4a50-9fa7-6a43c7edbc29-machine-approver-tls\") pod \"machine-approver-56656f9798-vc8ns\" (UID: \"a55008bb-1e97-4a50-9fa7-6a43c7edbc29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.818631 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/df32ebd2-1bfc-4da0-959a-abc479034b0b-etcd-ca\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.818652 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/401e164a-fc29-412f-ab6e-1c911f6c2d0a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zhnjs\" (UID: \"401e164a-fc29-412f-ab6e-1c911f6c2d0a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.818718 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea55ad1c-3f3c-418d-aed8-915b494eb6fa-config\") pod \"console-operator-58897d9998-xdm6r\" (UID: \"ea55ad1c-3f3c-418d-aed8-915b494eb6fa\") " pod="openshift-console-operator/console-operator-58897d9998-xdm6r"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.818779 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0fd12d6-f32c-4f69-a285-8f837e745910-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-z85bn\" (UID: \"b0fd12d6-f32c-4f69-a285-8f837e745910\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.818810 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0fd12d6-f32c-4f69-a285-8f837e745910-service-ca-bundle\") pod \"authentication-operator-69f744f599-z85bn\" (UID: \"b0fd12d6-f32c-4f69-a285-8f837e745910\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.818829 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/396daff4-daaf-43de-8794-5076381c0d47-serving-cert\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.818865 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z5fxc\" (UniqueName: \"kubernetes.io/projected/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-kube-api-access-z5fxc\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.818891 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0fd12d6-f32c-4f69-a285-8f837e745910-config\") pod \"authentication-operator-69f744f599-z85bn\" (UID: \"b0fd12d6-f32c-4f69-a285-8f837e745910\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.818930 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0fd12d6-f32c-4f69-a285-8f837e745910-serving-cert\") pod \"authentication-operator-69f744f599-z85bn\" (UID: \"b0fd12d6-f32c-4f69-a285-8f837e745910\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.818953 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tlhp\" (UniqueName: \"kubernetes.io/projected/df32ebd2-1bfc-4da0-959a-abc479034b0b-kube-api-access-9tlhp\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819003 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819049 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bnsnc\" (UniqueName: \"kubernetes.io/projected/401e164a-fc29-412f-ab6e-1c911f6c2d0a-kube-api-access-bnsnc\") pod \"cluster-samples-operator-665b6dd947-zhnjs\" (UID: \"401e164a-fc29-412f-ab6e-1c911f6c2d0a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819160 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea55ad1c-3f3c-418d-aed8-915b494eb6fa-serving-cert\") pod \"console-operator-58897d9998-xdm6r\" (UID: \"ea55ad1c-3f3c-418d-aed8-915b494eb6fa\") " pod="openshift-console-operator/console-operator-58897d9998-xdm6r"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819207 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-audit-policies\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819254 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819325 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/df32ebd2-1bfc-4da0-959a-abc479034b0b-etcd-client\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819358 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819416 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9kvj\" (UniqueName: \"kubernetes.io/projected/93cd26ea-e56f-4bb3-9bae-1e9b552480d8-kube-api-access-w9kvj\") pod \"openshift-controller-manager-operator-756b6f6bc6-c96ts\" (UID: \"93cd26ea-e56f-4bb3-9bae-1e9b552480d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819447 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3bde734c-df56-471b-8a70-2f555a974e57-node-pullsecrets\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819474 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3bde734c-df56-471b-8a70-2f555a974e57-audit\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819507 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nkbcm\" (UniqueName: \"kubernetes.io/projected/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-kube-api-access-nkbcm\") pod \"route-controller-manager-6576b87f9c-ptr8w\" (UID: \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819583 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wk4xl\" (UniqueName: \"kubernetes.io/projected/39658af6-59cf-48c7-9015-2271021bd64e-kube-api-access-wk4xl\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819667 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/396daff4-daaf-43de-8794-5076381c0d47-encryption-config\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819836 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wz75\" (UniqueName: \"kubernetes.io/projected/a55008bb-1e97-4a50-9fa7-6a43c7edbc29-kube-api-access-6wz75\") pod \"machine-approver-56656f9798-vc8ns\" (UID: \"a55008bb-1e97-4a50-9fa7-6a43c7edbc29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819867 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7k82\" (UniqueName: \"kubernetes.io/projected/993cdcd1-8323-49aa-b587-5a8c344a2077-kube-api-access-v7k82\") pod \"machine-api-operator-5694c8668f-7hqvg\" (UID: \"993cdcd1-8323-49aa-b587-5a8c344a2077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819891 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819923 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/773bc628-94fe-43c5-8247-48c8d510df6a-serving-cert\") pod \"openshift-config-operator-7777fb866f-2gjzh\" (UID: \"773bc628-94fe-43c5-8247-48c8d510df6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.819945 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-console-config\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.823017 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.823187 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.823035 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.823993 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.824180 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.824290 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.824639 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.831267 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.833181 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.834114 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df32ebd2-1bfc-4da0-959a-abc479034b0b-config\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.834166 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tpwq\" (UniqueName: \"kubernetes.io/projected/72f5edb0-c000-4e80-b27d-d0d6023510f8-kube-api-access-6tpwq\") pod \"controller-manager-879f6c89f-r9spg\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.834193 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3bde734c-df56-471b-8a70-2f555a974e57-etcd-serving-ca\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.837784 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.838127 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-audit-dir\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.838784 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.840791 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.843355 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7hqvg"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.843387 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.843830 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.843861 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.845822 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.847721 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.847830 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.850997 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.851724 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.853302 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5bjx5"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.854000 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bjx5"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.854928 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6wpdr"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.855302 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-sw5kt"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.855349 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.856291 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-sw5kt"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.857694 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.863850 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.864565 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.866734 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h7csr"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.867982 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-h7csr"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.869142 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdqns"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.871160 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdqns"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.871774 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.876334 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.877872 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.878262 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.879339 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.879690 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.883423 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-bj5n6"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.885364 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.886495 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-bj5n6"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.890720 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.894332 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.894940 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.896764 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.897369 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.900739 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.901361 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.902814 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565264-nvg5v"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.903821 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565264-nvg5v"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.905657 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.907056 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.908277 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-s58pr"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.909029 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xdm6r"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.910335 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.911462 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.912681 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-524cv"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.913632 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-tk8m5"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.913664 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-524cv"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.915011 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-tk8m5"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.916299 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-56zlb"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.917671 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.917885 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-z85bn"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.918608 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.919926 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.920753 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.921821 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h7csr"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.922895 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jnmk9"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.924233 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-94dnk"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.925684 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8sk6m"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.926946 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r9spg"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.928048 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdqns"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.929124 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.930211 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x94pq"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.931248 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5bjx5"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.932230 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565264-nvg5v"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.933425 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-sw5kt"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.934535 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.935689 4765 kubelet.go:2428]
"SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.936904 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qfvdv"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.938116 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.938235 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6wpdr"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.938732 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea55ad1c-3f3c-418d-aed8-915b494eb6fa-trusted-ca\") pod \"console-operator-58897d9998-xdm6r\" (UID: \"ea55ad1c-3f3c-418d-aed8-915b494eb6fa\") " pod="openshift-console-operator/console-operator-58897d9998-xdm6r" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.938767 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.938803 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bde734c-df56-471b-8a70-2f555a974e57-serving-cert\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.938830 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a55008bb-1e97-4a50-9fa7-6a43c7edbc29-machine-approver-tls\") pod \"machine-approver-56656f9798-vc8ns\" (UID: \"a55008bb-1e97-4a50-9fa7-6a43c7edbc29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.938865 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39658af6-59cf-48c7-9015-2271021bd64e-console-serving-cert\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.938893 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/401e164a-fc29-412f-ab6e-1c911f6c2d0a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zhnjs\" (UID: \"401e164a-fc29-412f-ab6e-1c911f6c2d0a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.938918 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea55ad1c-3f3c-418d-aed8-915b494eb6fa-config\") pod \"console-operator-58897d9998-xdm6r\" (UID: \"ea55ad1c-3f3c-418d-aed8-915b494eb6fa\") " pod="openshift-console-operator/console-operator-58897d9998-xdm6r" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.938949 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/df32ebd2-1bfc-4da0-959a-abc479034b0b-etcd-ca\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" Mar 19 10:25:43 crc 
kubenswrapper[4765]: I0319 10:25:43.939003 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0fd12d6-f32c-4f69-a285-8f837e745910-service-ca-bundle\") pod \"authentication-operator-69f744f599-z85bn\" (UID: \"b0fd12d6-f32c-4f69-a285-8f837e745910\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939039 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/396daff4-daaf-43de-8794-5076381c0d47-serving-cert\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939071 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z5fxc\" (UniqueName: \"kubernetes.io/projected/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-kube-api-access-z5fxc\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939112 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f418b4ca-da05-4139-b8d9-5614419e936b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xhwkd\" (UID: \"f418b4ca-da05-4139-b8d9-5614419e936b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939150 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0fd12d6-f32c-4f69-a285-8f837e745910-trusted-ca-bundle\") pod 
\"authentication-operator-69f744f599-z85bn\" (UID: \"b0fd12d6-f32c-4f69-a285-8f837e745910\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939189 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0fd12d6-f32c-4f69-a285-8f837e745910-serving-cert\") pod \"authentication-operator-69f744f599-z85bn\" (UID: \"b0fd12d6-f32c-4f69-a285-8f837e745910\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939214 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tlhp\" (UniqueName: \"kubernetes.io/projected/df32ebd2-1bfc-4da0-959a-abc479034b0b-kube-api-access-9tlhp\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939238 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0fd12d6-f32c-4f69-a285-8f837e745910-config\") pod \"authentication-operator-69f744f599-z85bn\" (UID: \"b0fd12d6-f32c-4f69-a285-8f837e745910\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939260 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bnsnc\" (UniqueName: \"kubernetes.io/projected/401e164a-fc29-412f-ab6e-1c911f6c2d0a-kube-api-access-bnsnc\") pod \"cluster-samples-operator-665b6dd947-zhnjs\" (UID: \"401e164a-fc29-412f-ab6e-1c911f6c2d0a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939280 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea55ad1c-3f3c-418d-aed8-915b494eb6fa-serving-cert\") pod \"console-operator-58897d9998-xdm6r\" (UID: \"ea55ad1c-3f3c-418d-aed8-915b494eb6fa\") " pod="openshift-console-operator/console-operator-58897d9998-xdm6r" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939302 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-audit-policies\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939324 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939348 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939410 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/df32ebd2-1bfc-4da0-959a-abc479034b0b-etcd-client\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939437 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939462 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3bde734c-df56-471b-8a70-2f555a974e57-node-pullsecrets\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939485 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3bde734c-df56-471b-8a70-2f555a974e57-audit\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939506 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nkbcm\" (UniqueName: \"kubernetes.io/projected/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-kube-api-access-nkbcm\") pod \"route-controller-manager-6576b87f9c-ptr8w\" (UID: \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939537 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9kvj\" (UniqueName: 
\"kubernetes.io/projected/93cd26ea-e56f-4bb3-9bae-1e9b552480d8-kube-api-access-w9kvj\") pod \"openshift-controller-manager-operator-756b6f6bc6-c96ts\" (UID: \"93cd26ea-e56f-4bb3-9bae-1e9b552480d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939558 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wk4xl\" (UniqueName: \"kubernetes.io/projected/39658af6-59cf-48c7-9015-2271021bd64e-kube-api-access-wk4xl\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939581 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/396daff4-daaf-43de-8794-5076381c0d47-encryption-config\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939613 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wz75\" (UniqueName: \"kubernetes.io/projected/a55008bb-1e97-4a50-9fa7-6a43c7edbc29-kube-api-access-6wz75\") pod \"machine-approver-56656f9798-vc8ns\" (UID: \"a55008bb-1e97-4a50-9fa7-6a43c7edbc29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939635 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7k82\" (UniqueName: \"kubernetes.io/projected/993cdcd1-8323-49aa-b587-5a8c344a2077-kube-api-access-v7k82\") pod \"machine-api-operator-5694c8668f-7hqvg\" (UID: \"993cdcd1-8323-49aa-b587-5a8c344a2077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg" Mar 19 10:25:43 crc 
kubenswrapper[4765]: I0319 10:25:43.939654 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939665 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939677 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-console-config\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939753 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.940477 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.939795 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/773bc628-94fe-43c5-8247-48c8d510df6a-serving-cert\") pod \"openshift-config-operator-7777fb866f-2gjzh\" (UID: 
\"773bc628-94fe-43c5-8247-48c8d510df6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.940992 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df32ebd2-1bfc-4da0-959a-abc479034b0b-config\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941003 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-console-config\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941057 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tpwq\" (UniqueName: \"kubernetes.io/projected/72f5edb0-c000-4e80-b27d-d0d6023510f8-kube-api-access-6tpwq\") pod \"controller-manager-879f6c89f-r9spg\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941119 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f418b4ca-da05-4139-b8d9-5614419e936b-config\") pod \"kube-apiserver-operator-766d6c64bb-xhwkd\" (UID: \"f418b4ca-da05-4139-b8d9-5614419e936b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941159 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: 
\"kubernetes.io/configmap/3bde734c-df56-471b-8a70-2f555a974e57-etcd-serving-ca\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941290 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-audit-dir\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941307 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941355 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-client-ca\") pod \"controller-manager-879f6c89f-r9spg\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941388 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93cd26ea-e56f-4bb3-9bae-1e9b552480d8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-c96ts\" (UID: \"93cd26ea-e56f-4bb3-9bae-1e9b552480d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941443 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7828p\" (UniqueName: \"kubernetes.io/projected/773bc628-94fe-43c5-8247-48c8d510df6a-kube-api-access-7828p\") pod \"openshift-config-operator-7777fb866f-2gjzh\" (UID: \"773bc628-94fe-43c5-8247-48c8d510df6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941474 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df32ebd2-1bfc-4da0-959a-abc479034b0b-serving-cert\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941478 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-audit-policies\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941549 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-trusted-ca-bundle\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941639 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941579 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: 
\"kubernetes.io/configmap/3bde734c-df56-471b-8a70-2f555a974e57-image-import-ca\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941680 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/396daff4-daaf-43de-8794-5076381c0d47-audit-dir\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941700 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72f5edb0-c000-4e80-b27d-d0d6023510f8-serving-cert\") pod \"controller-manager-879f6c89f-r9spg\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941728 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93cd26ea-e56f-4bb3-9bae-1e9b552480d8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-c96ts\" (UID: \"93cd26ea-e56f-4bb3-9bae-1e9b552480d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941748 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3bde734c-df56-471b-8a70-2f555a974e57-audit-dir\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941767 4765 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-serving-cert\") pod \"route-controller-manager-6576b87f9c-ptr8w\" (UID: \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941787 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-57drz\" (UniqueName: \"kubernetes.io/projected/ea55ad1c-3f3c-418d-aed8-915b494eb6fa-kube-api-access-57drz\") pod \"console-operator-58897d9998-xdm6r\" (UID: \"ea55ad1c-3f3c-418d-aed8-915b494eb6fa\") " pod="openshift-console-operator/console-operator-58897d9998-xdm6r" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941813 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-client-ca\") pod \"route-controller-manager-6576b87f9c-ptr8w\" (UID: \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941833 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bde734c-df56-471b-8a70-2f555a974e57-config\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941851 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/396daff4-daaf-43de-8794-5076381c0d47-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: 
I0319 10:25:43.941881 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/993cdcd1-8323-49aa-b587-5a8c344a2077-images\") pod \"machine-api-operator-5694c8668f-7hqvg\" (UID: \"993cdcd1-8323-49aa-b587-5a8c344a2077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941909 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-r9spg\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941932 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/df32ebd2-1bfc-4da0-959a-abc479034b0b-etcd-service-ca\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941950 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/396daff4-daaf-43de-8794-5076381c0d47-etcd-client\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.941985 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a55008bb-1e97-4a50-9fa7-6a43c7edbc29-auth-proxy-config\") pod \"machine-approver-56656f9798-vc8ns\" (UID: \"a55008bb-1e97-4a50-9fa7-6a43c7edbc29\") " 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942006 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flkks\" (UniqueName: \"kubernetes.io/projected/3bde734c-df56-471b-8a70-2f555a974e57-kube-api-access-flkks\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942025 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bde734c-df56-471b-8a70-2f555a974e57-trusted-ca-bundle\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942041 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/993cdcd1-8323-49aa-b587-5a8c344a2077-config\") pod \"machine-api-operator-5694c8668f-7hqvg\" (UID: \"993cdcd1-8323-49aa-b587-5a8c344a2077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942059 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942082 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m4s7q\" (UniqueName: \"kubernetes.io/projected/b0fd12d6-f32c-4f69-a285-8f837e745910-kube-api-access-m4s7q\") 
pod \"authentication-operator-69f744f599-z85bn\" (UID: \"b0fd12d6-f32c-4f69-a285-8f837e745910\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942098 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-config\") pod \"controller-manager-879f6c89f-r9spg\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942116 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942133 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3bde734c-df56-471b-8a70-2f555a974e57-etcd-client\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942161 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/396daff4-daaf-43de-8794-5076381c0d47-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942191 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd7kn\" (UniqueName: 
\"kubernetes.io/projected/396daff4-daaf-43de-8794-5076381c0d47-kube-api-access-fd7kn\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942219 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942244 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/773bc628-94fe-43c5-8247-48c8d510df6a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2gjzh\" (UID: \"773bc628-94fe-43c5-8247-48c8d510df6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942267 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3bde734c-df56-471b-8a70-2f555a974e57-encryption-config\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942291 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39658af6-59cf-48c7-9015-2271021bd64e-console-oauth-config\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942312 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-oauth-serving-cert\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942338 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/993cdcd1-8323-49aa-b587-5a8c344a2077-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7hqvg\" (UID: \"993cdcd1-8323-49aa-b587-5a8c344a2077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942359 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942381 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f418b4ca-da05-4139-b8d9-5614419e936b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xhwkd\" (UID: \"f418b4ca-da05-4139-b8d9-5614419e936b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942408 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-service-ca\") pod \"console-f9d7485db-94dnk\" (UID: 
\"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942432 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a55008bb-1e97-4a50-9fa7-6a43c7edbc29-config\") pod \"machine-approver-56656f9798-vc8ns\" (UID: \"a55008bb-1e97-4a50-9fa7-6a43c7edbc29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942452 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-config\") pod \"route-controller-manager-6576b87f9c-ptr8w\" (UID: \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942480 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/396daff4-daaf-43de-8794-5076381c0d47-audit-policies\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942502 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.942723 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ea55ad1c-3f3c-418d-aed8-915b494eb6fa-config\") pod 
\"console-operator-58897d9998-xdm6r\" (UID: \"ea55ad1c-3f3c-418d-aed8-915b494eb6fa\") " pod="openshift-console-operator/console-operator-58897d9998-xdm6r" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.943566 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.943615 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/3bde734c-df56-471b-8a70-2f555a974e57-image-import-ca\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.943943 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/df32ebd2-1bfc-4da0-959a-abc479034b0b-etcd-ca\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.944091 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/396daff4-daaf-43de-8794-5076381c0d47-audit-dir\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.945153 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0fd12d6-f32c-4f69-a285-8f837e745910-service-ca-bundle\") 
pod \"authentication-operator-69f744f599-z85bn\" (UID: \"b0fd12d6-f32c-4f69-a285-8f837e745910\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.945476 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-config\") pod \"controller-manager-879f6c89f-r9spg\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.945700 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ea55ad1c-3f3c-418d-aed8-915b494eb6fa-trusted-ca\") pod \"console-operator-58897d9998-xdm6r\" (UID: \"ea55ad1c-3f3c-418d-aed8-915b494eb6fa\") " pod="openshift-console-operator/console-operator-58897d9998-xdm6r" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.945936 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b0fd12d6-f32c-4f69-a285-8f837e745910-config\") pod \"authentication-operator-69f744f599-z85bn\" (UID: \"b0fd12d6-f32c-4f69-a285-8f837e745910\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.946142 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.946890 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/df32ebd2-1bfc-4da0-959a-abc479034b0b-config\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.947530 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39658af6-59cf-48c7-9015-2271021bd64e-console-serving-cert\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.947441 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-audit-dir\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.948476 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-oauth-serving-cert\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.948601 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/396daff4-daaf-43de-8794-5076381c0d47-serving-cert\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.948671 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-client-ca\") pod \"controller-manager-879f6c89f-r9spg\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.949054 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.949140 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/396daff4-daaf-43de-8794-5076381c0d47-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.949455 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/3bde734c-df56-471b-8a70-2f555a974e57-etcd-serving-ca\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.949530 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/773bc628-94fe-43c5-8247-48c8d510df6a-available-featuregates\") pod \"openshift-config-operator-7777fb866f-2gjzh\" (UID: \"773bc628-94fe-43c5-8247-48c8d510df6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.949654 4765 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/93cd26ea-e56f-4bb3-9bae-1e9b552480d8-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-c96ts\" (UID: \"93cd26ea-e56f-4bb3-9bae-1e9b552480d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.949995 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/a55008bb-1e97-4a50-9fa7-6a43c7edbc29-machine-approver-tls\") pod \"machine-approver-56656f9798-vc8ns\" (UID: \"a55008bb-1e97-4a50-9fa7-6a43c7edbc29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.950084 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/3bde734c-df56-471b-8a70-2f555a974e57-audit-dir\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.950383 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ea55ad1c-3f3c-418d-aed8-915b494eb6fa-serving-cert\") pod \"console-operator-58897d9998-xdm6r\" (UID: \"ea55ad1c-3f3c-418d-aed8-915b494eb6fa\") " pod="openshift-console-operator/console-operator-58897d9998-xdm6r" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.950521 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/3bde734c-df56-471b-8a70-2f555a974e57-audit\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.950578 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/3bde734c-df56-471b-8a70-2f555a974e57-node-pullsecrets\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.950606 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3bde734c-df56-471b-8a70-2f555a974e57-serving-cert\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.950819 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.951199 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/3bde734c-df56-471b-8a70-2f555a974e57-etcd-client\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.951277 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/df32ebd2-1bfc-4da0-959a-abc479034b0b-etcd-service-ca\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 
10:25:43.951512 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/df32ebd2-1bfc-4da0-959a-abc479034b0b-etcd-client\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.951862 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b0fd12d6-f32c-4f69-a285-8f837e745910-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-z85bn\" (UID: \"b0fd12d6-f32c-4f69-a285-8f837e745910\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.952280 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-client-ca\") pod \"route-controller-manager-6576b87f9c-ptr8w\" (UID: \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.952380 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/396daff4-daaf-43de-8794-5076381c0d47-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.952393 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72f5edb0-c000-4e80-b27d-d0d6023510f8-serving-cert\") pod \"controller-manager-879f6c89f-r9spg\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.952809 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/396daff4-daaf-43de-8794-5076381c0d47-encryption-config\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.952908 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3bde734c-df56-471b-8a70-2f555a974e57-trusted-ca-bundle\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.953132 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3bde734c-df56-471b-8a70-2f555a974e57-config\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.953180 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a55008bb-1e97-4a50-9fa7-6a43c7edbc29-config\") pod \"machine-approver-56656f9798-vc8ns\" (UID: \"a55008bb-1e97-4a50-9fa7-6a43c7edbc29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.953272 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/773bc628-94fe-43c5-8247-48c8d510df6a-serving-cert\") pod \"openshift-config-operator-7777fb866f-2gjzh\" (UID: \"773bc628-94fe-43c5-8247-48c8d510df6a\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.953390 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/3bde734c-df56-471b-8a70-2f555a974e57-encryption-config\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.953434 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fl4n5"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.953499 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-serving-cert\") pod \"route-controller-manager-6576b87f9c-ptr8w\" (UID: \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.953859 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.954210 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-service-ca\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.954274 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/a55008bb-1e97-4a50-9fa7-6a43c7edbc29-auth-proxy-config\") pod \"machine-approver-56656f9798-vc8ns\" (UID: \"a55008bb-1e97-4a50-9fa7-6a43c7edbc29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.954739 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-x9qcs"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.955172 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-config\") pod \"route-controller-manager-6576b87f9c-ptr8w\" (UID: \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.955192 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x9qcs" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.955451 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/396daff4-daaf-43de-8794-5076381c0d47-audit-policies\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.955534 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.955611 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-r9spg\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.955724 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.955871 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf"] Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.955921 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39658af6-59cf-48c7-9015-2271021bd64e-console-oauth-config\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.957396 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-trusted-ca-bundle\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.957491 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.957988 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/993cdcd1-8323-49aa-b587-5a8c344a2077-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-7hqvg\" (UID: \"993cdcd1-8323-49aa-b587-5a8c344a2077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.958430 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df32ebd2-1bfc-4da0-959a-abc479034b0b-serving-cert\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.958670 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/401e164a-fc29-412f-ab6e-1c911f6c2d0a-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zhnjs\" (UID: \"401e164a-fc29-412f-ab6e-1c911f6c2d0a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.958784 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b0fd12d6-f32c-4f69-a285-8f837e745910-serving-cert\") pod \"authentication-operator-69f744f599-z85bn\" (UID: \"b0fd12d6-f32c-4f69-a285-8f837e745910\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.960682 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/993cdcd1-8323-49aa-b587-5a8c344a2077-config\") pod \"machine-api-operator-5694c8668f-7hqvg\" (UID: \"993cdcd1-8323-49aa-b587-5a8c344a2077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.960732 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.961217 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/993cdcd1-8323-49aa-b587-5a8c344a2077-images\") pod \"machine-api-operator-5694c8668f-7hqvg\" (UID: \"993cdcd1-8323-49aa-b587-5a8c344a2077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.961571 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.962197 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.966229 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.966275 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/93cd26ea-e56f-4bb3-9bae-1e9b552480d8-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-c96ts\" (UID: \"93cd26ea-e56f-4bb3-9bae-1e9b552480d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.967652 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/396daff4-daaf-43de-8794-5076381c0d47-etcd-client\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.967681 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.968166 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-524cv"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.969315 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.970470 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.972063 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tk8m5"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.973501 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fl4n5"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.974877 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.976183 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q"]
Mar 19 10:25:43 crc kubenswrapper[4765]: I0319 10:25:43.978435 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.000299 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.018439 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.043279 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f418b4ca-da05-4139-b8d9-5614419e936b-config\") pod \"kube-apiserver-operator-766d6c64bb-xhwkd\" (UID: \"f418b4ca-da05-4139-b8d9-5614419e936b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.043513 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f418b4ca-da05-4139-b8d9-5614419e936b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xhwkd\" (UID: \"f418b4ca-da05-4139-b8d9-5614419e936b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.043584 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f418b4ca-da05-4139-b8d9-5614419e936b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xhwkd\" (UID: \"f418b4ca-da05-4139-b8d9-5614419e936b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.058277 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.078482 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.098654 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.118730 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.139440 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.158173 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.170484 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f418b4ca-da05-4139-b8d9-5614419e936b-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-xhwkd\" (UID: \"f418b4ca-da05-4139-b8d9-5614419e936b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.178918 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.184568 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f418b4ca-da05-4139-b8d9-5614419e936b-config\") pod \"kube-apiserver-operator-766d6c64bb-xhwkd\" (UID: \"f418b4ca-da05-4139-b8d9-5614419e936b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.198253 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.217531 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.258661 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.278030 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.298152 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.319161 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.337720 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.357381 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.386792 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.397789 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.418657 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.438088 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.459319 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.478589 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.497568 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.518198 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.539286 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.558416 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.578919 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.599495 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.618824 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.638502 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.674060 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.677762 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.697854 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.718600 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.737662 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.758466 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.778694 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.798353 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.818418 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.838266 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.858223 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.876501 4765 request.go:700] Waited for 1.005047701s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-api/secrets?fieldSelector=metadata.name%3Dcontrol-plane-machine-set-operator-dockercfg-k9rxt&limit=500&resourceVersion=0
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.878484 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.897903 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.918937 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.938725 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.959021 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert"
Mar 19 10:25:44 crc kubenswrapper[4765]: I0319 10:25:44.979055 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.000856 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.019331 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.038597 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.059592 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.078596 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.098193 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.118978 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.138576 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.157729 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.178799 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.199327 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.219494 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.237764 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.258419 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.278128 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.299085 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.319127 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.338647 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.359096 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.378983 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.397983 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.418589 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.438417 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.459274 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.479673 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.499179 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.519543 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.538826 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.559031 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.578744 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.598100 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.619830 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.668033 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m4s7q\" (UniqueName: \"kubernetes.io/projected/b0fd12d6-f32c-4f69-a285-8f837e745910-kube-api-access-m4s7q\") pod \"authentication-operator-69f744f599-z85bn\" (UID: \"b0fd12d6-f32c-4f69-a285-8f837e745910\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.702470 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tlhp\" (UniqueName: \"kubernetes.io/projected/df32ebd2-1bfc-4da0-959a-abc479034b0b-kube-api-access-9tlhp\") pod \"etcd-operator-b45778765-s58pr\" (UID: \"df32ebd2-1bfc-4da0-959a-abc479034b0b\") " pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.702470 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z5fxc\" (UniqueName: \"kubernetes.io/projected/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-kube-api-access-z5fxc\") pod \"oauth-openshift-558db77b4-8sk6m\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.717429 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bnsnc\" (UniqueName: \"kubernetes.io/projected/401e164a-fc29-412f-ab6e-1c911f6c2d0a-kube-api-access-bnsnc\") pod \"cluster-samples-operator-665b6dd947-zhnjs\" (UID: \"401e164a-fc29-412f-ab6e-1c911f6c2d0a\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.739012 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tpwq\" (UniqueName: \"kubernetes.io/projected/72f5edb0-c000-4e80-b27d-d0d6023510f8-kube-api-access-6tpwq\") pod \"controller-manager-879f6c89f-r9spg\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.754507 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9kvj\" (UniqueName: \"kubernetes.io/projected/93cd26ea-e56f-4bb3-9bae-1e9b552480d8-kube-api-access-w9kvj\") pod \"openshift-controller-manager-operator-756b6f6bc6-c96ts\" (UID: \"93cd26ea-e56f-4bb3-9bae-1e9b552480d8\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.774113 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wk4xl\" (UniqueName: \"kubernetes.io/projected/39658af6-59cf-48c7-9015-2271021bd64e-kube-api-access-wk4xl\") pod \"console-f9d7485db-94dnk\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " pod="openshift-console/console-f9d7485db-94dnk"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.799157 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd7kn\" (UniqueName: \"kubernetes.io/projected/396daff4-daaf-43de-8794-5076381c0d47-kube-api-access-fd7kn\") pod \"apiserver-7bbb656c7d-4tmv9\" (UID: \"396daff4-daaf-43de-8794-5076381c0d47\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.818032 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7828p\" (UniqueName: \"kubernetes.io/projected/773bc628-94fe-43c5-8247-48c8d510df6a-kube-api-access-7828p\") pod \"openshift-config-operator-7777fb866f-2gjzh\" (UID: \"773bc628-94fe-43c5-8247-48c8d510df6a\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.821256 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.837629 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7k82\" (UniqueName: \"kubernetes.io/projected/993cdcd1-8323-49aa-b587-5a8c344a2077-kube-api-access-v7k82\") pod \"machine-api-operator-5694c8668f-7hqvg\" (UID: \"993cdcd1-8323-49aa-b587-5a8c344a2077\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.860109 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wz75\" (UniqueName: \"kubernetes.io/projected/a55008bb-1e97-4a50-9fa7-6a43c7edbc29-kube-api-access-6wz75\") pod \"machine-approver-56656f9798-vc8ns\" (UID: \"a55008bb-1e97-4a50-9fa7-6a43c7edbc29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.866315 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.881558 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nkbcm\" (UniqueName: \"kubernetes.io/projected/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-kube-api-access-nkbcm\") pod \"route-controller-manager-6576b87f9c-ptr8w\" (UID: \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.890139 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.896505 4765 request.go:700] Waited for 1.940961228s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-machine-config-operator/secrets?fieldSelector=metadata.name%3Dmachine-config-server-dockercfg-qx5rd&limit=500&resourceVersion=0
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.899668 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.900661 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-57drz\" (UniqueName: \"kubernetes.io/projected/ea55ad1c-3f3c-418d-aed8-915b494eb6fa-kube-api-access-57drz\") pod \"console-operator-58897d9998-xdm6r\" (UID: \"ea55ad1c-3f3c-418d-aed8-915b494eb6fa\") " pod="openshift-console-operator/console-operator-58897d9998-xdm6r"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.914642 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.918415 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.940222 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.940427 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token"
Mar 19 10:25:45 crc kubenswrapper[4765]: W0319 10:25:45.951601 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda55008bb_1e97_4a50_9fa7_6a43c7edbc29.slice/crio-073133ef80d614e3dd5c8e28a5eddf19746256761267c32f4f96d400117ef926 WatchSource:0}: Error finding container 073133ef80d614e3dd5c8e28a5eddf19746256761267c32f4f96d400117ef926: Status 404 returned error can't find the container with id 073133ef80d614e3dd5c8e28a5eddf19746256761267c32f4f96d400117ef926
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.954851 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-xdm6r"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.958642 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.965149 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.970096 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.989004 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m"
Mar 19 10:25:45 crc kubenswrapper[4765]: I0319 10:25:45.989166 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:45.998068 4765 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:45.999564 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flkks\" (UniqueName: \"kubernetes.io/projected/3bde734c-df56-471b-8a70-2f555a974e57-kube-api-access-flkks\") pod \"apiserver-76f77b778f-56zlb\" (UID: \"3bde734c-df56-471b-8a70-2f555a974e57\") " pod="openshift-apiserver/apiserver-76f77b778f-56zlb"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.008627 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.010332 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-94dnk"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.018714 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.023418 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-56zlb"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.060181 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f418b4ca-da05-4139-b8d9-5614419e936b-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-xhwkd\" (UID: \"f418b4ca-da05-4139-b8d9-5614419e936b\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.061191 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9"]
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.098682 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-z85bn"]
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.103523 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.121897 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179075 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54a73adb-452b-4db3-9bc2-3411d1575eb5-registry-certificates\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179120 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24b9c7be-22c4-4959-9332-d06229dd3371-bound-sa-token\") pod \"ingress-operator-5b745b69d9-28fs8\" (UID: \"24b9c7be-22c4-4959-9332-d06229dd3371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179172 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/900ba5b7-85f6-4924-af0b-61efc2c8598a-metrics-tls\") pod \"dns-operator-744455d44c-qfvdv\" (UID: \"900ba5b7-85f6-4924-af0b-61efc2c8598a\") " pod="openshift-dns-operator/dns-operator-744455d44c-qfvdv"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179214 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54a73adb-452b-4db3-9bc2-3411d1575eb5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179264 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djbdm\" (UniqueName: \"kubernetes.io/projected/19a81c88-752a-4a04-a1f7-70cb357f7be1-kube-api-access-djbdm\") pod \"cluster-image-registry-operator-dc59b4c8b-k7bsv\" (UID: \"19a81c88-752a-4a04-a1f7-70cb357f7be1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179293 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f2txw\" (UniqueName: \"kubernetes.io/projected/926402ae-efb3-46fa-b415-8333a236c36a-kube-api-access-f2txw\") pod \"machine-config-controller-84d6567774-krtbj\" (UID: \"926402ae-efb3-46fa-b415-8333a236c36a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179333 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357560a9-5851-42f2-b627-a41d831d7f27-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mt2gd\" (UID: \"357560a9-5851-42f2-b627-a41d831d7f27\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179437 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gkhbf\" (UniqueName: \"kubernetes.io/projected/900ba5b7-85f6-4924-af0b-61efc2c8598a-kube-api-access-gkhbf\") pod \"dns-operator-744455d44c-qfvdv\" (UID: \"900ba5b7-85f6-4924-af0b-61efc2c8598a\") " pod="openshift-dns-operator/dns-operator-744455d44c-qfvdv"
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179453 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19a81c88-752a-4a04-a1f7-70cb357f7be1-bound-sa-token\") pod 
\"cluster-image-registry-operator-dc59b4c8b-k7bsv\" (UID: \"19a81c88-752a-4a04-a1f7-70cb357f7be1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179468 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/24b9c7be-22c4-4959-9332-d06229dd3371-metrics-tls\") pod \"ingress-operator-5b745b69d9-28fs8\" (UID: \"24b9c7be-22c4-4959-9332-d06229dd3371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179494 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357560a9-5851-42f2-b627-a41d831d7f27-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mt2gd\" (UID: \"357560a9-5851-42f2-b627-a41d831d7f27\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179518 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fdxhv\" (UniqueName: \"kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-kube-api-access-fdxhv\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179548 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/19a81c88-752a-4a04-a1f7-70cb357f7be1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k7bsv\" (UID: \"19a81c88-752a-4a04-a1f7-70cb357f7be1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" Mar 19 10:25:46 crc 
kubenswrapper[4765]: I0319 10:25:46.179568 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19a81c88-752a-4a04-a1f7-70cb357f7be1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k7bsv\" (UID: \"19a81c88-752a-4a04-a1f7-70cb357f7be1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179613 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-bound-sa-token\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179631 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54a73adb-452b-4db3-9bc2-3411d1575eb5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179646 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/926402ae-efb3-46fa-b415-8333a236c36a-proxy-tls\") pod \"machine-config-controller-84d6567774-krtbj\" (UID: \"926402ae-efb3-46fa-b415-8333a236c36a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179663 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/54a73adb-452b-4db3-9bc2-3411d1575eb5-trusted-ca\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179758 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8854m\" (UniqueName: \"kubernetes.io/projected/4980eaf1-2428-41fe-8a4f-052aace46947-kube-api-access-8854m\") pod \"downloads-7954f5f757-jnmk9\" (UID: \"4980eaf1-2428-41fe-8a4f-052aace46947\") " pod="openshift-console/downloads-7954f5f757-jnmk9" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179803 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179830 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-registry-tls\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179849 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/926402ae-efb3-46fa-b415-8333a236c36a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-krtbj\" (UID: \"926402ae-efb3-46fa-b415-8333a236c36a\") " 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179865 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24b9c7be-22c4-4959-9332-d06229dd3371-trusted-ca\") pod \"ingress-operator-5b745b69d9-28fs8\" (UID: \"24b9c7be-22c4-4959-9332-d06229dd3371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179881 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vtzvz\" (UniqueName: \"kubernetes.io/projected/24b9c7be-22c4-4959-9332-d06229dd3371-kube-api-access-vtzvz\") pod \"ingress-operator-5b745b69d9-28fs8\" (UID: \"24b9c7be-22c4-4959-9332-d06229dd3371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.179907 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2x8h\" (UniqueName: \"kubernetes.io/projected/357560a9-5851-42f2-b627-a41d831d7f27-kube-api-access-t2x8h\") pod \"openshift-apiserver-operator-796bbdcf4f-mt2gd\" (UID: \"357560a9-5851-42f2-b627-a41d831d7f27\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd" Mar 19 10:25:46 crc kubenswrapper[4765]: E0319 10:25:46.181813 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:46.68179556 +0000 UTC m=+245.030741102 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.281100 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:46 crc kubenswrapper[4765]: E0319 10:25:46.281270 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:46.78123525 +0000 UTC m=+245.130180792 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.281344 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54a73adb-452b-4db3-9bc2-3411d1575eb5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.281381 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/926402ae-efb3-46fa-b415-8333a236c36a-proxy-tls\") pod \"machine-config-controller-84d6567774-krtbj\" (UID: \"926402ae-efb3-46fa-b415-8333a236c36a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.281412 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c7ffece-fbab-4d6c-a327-f0402649a29e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gg89j\" (UID: \"1c7ffece-fbab-4d6c-a327-f0402649a29e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.281620 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: 
\"kubernetes.io/host-path/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-plugins-dir\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.281677 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c6536bc9-dc65-4acd-975f-87f5621fb0f3-signing-key\") pod \"service-ca-9c57cc56f-h7csr\" (UID: \"c6536bc9-dc65-4acd-975f-87f5621fb0f3\") " pod="openshift-service-ca/service-ca-9c57cc56f-h7csr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.281699 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fba67808-dc6d-4f9e-bd53-9185baa79d78-stats-auth\") pod \"router-default-5444994796-bj5n6\" (UID: \"fba67808-dc6d-4f9e-bd53-9185baa79d78\") " pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.281728 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kp96v\" (UniqueName: \"kubernetes.io/projected/736dd712-8f96-4b9c-bf2b-2f3eb3d4a604-kube-api-access-kp96v\") pod \"multus-admission-controller-857f4d67dd-sw5kt\" (UID: \"736dd712-8f96-4b9c-bf2b-2f3eb3d4a604\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sw5kt" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.281773 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93df02a3-8614-47ff-a1ed-9592ef47d84e-metrics-tls\") pod \"dns-default-tk8m5\" (UID: \"93df02a3-8614-47ff-a1ed-9592ef47d84e\") " pod="openshift-dns/dns-default-tk8m5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.281874 4765 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54a73adb-452b-4db3-9bc2-3411d1575eb5-ca-trust-extracted\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.281915 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-registry-tls\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.281994 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ea5c445-7213-449a-84b3-94ef0ddad18e-webhook-cert\") pod \"packageserver-d55dfcdfc-7jw7x\" (UID: \"4ea5c445-7213-449a-84b3-94ef0ddad18e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.282082 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c6536bc9-dc65-4acd-975f-87f5621fb0f3-signing-cabundle\") pod \"service-ca-9c57cc56f-h7csr\" (UID: \"c6536bc9-dc65-4acd-975f-87f5621fb0f3\") " pod="openshift-service-ca/service-ca-9c57cc56f-h7csr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.282136 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/926402ae-efb3-46fa-b415-8333a236c36a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-krtbj\" (UID: \"926402ae-efb3-46fa-b415-8333a236c36a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj" Mar 
19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.282161 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bb55d3af-2526-425e-8a0a-25b779589866-profile-collector-cert\") pod \"catalog-operator-68c6474976-gvklb\" (UID: \"bb55d3af-2526-425e-8a0a-25b779589866\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.282187 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q6hhf\" (UniqueName: \"kubernetes.io/projected/dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0-kube-api-access-q6hhf\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdqns\" (UID: \"dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdqns" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.282214 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2x8h\" (UniqueName: \"kubernetes.io/projected/357560a9-5851-42f2-b627-a41d831d7f27-kube-api-access-t2x8h\") pod \"openshift-apiserver-operator-796bbdcf4f-mt2gd\" (UID: \"357560a9-5851-42f2-b627-a41d831d7f27\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.282273 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k979c\" (UniqueName: \"kubernetes.io/projected/761e2822-68b0-4ea8-ada6-80f8ce6dec21-kube-api-access-k979c\") pod \"machine-config-operator-74547568cd-j8smv\" (UID: \"761e2822-68b0-4ea8-ada6-80f8ce6dec21\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.282355 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-socket-dir\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.282861 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-registration-dir\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.283028 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/900ba5b7-85f6-4924-af0b-61efc2c8598a-metrics-tls\") pod \"dns-operator-744455d44c-qfvdv\" (UID: \"900ba5b7-85f6-4924-af0b-61efc2c8598a\") " pod="openshift-dns-operator/dns-operator-744455d44c-qfvdv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.283568 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/761e2822-68b0-4ea8-ada6-80f8ce6dec21-auth-proxy-config\") pod \"machine-config-operator-74547568cd-j8smv\" (UID: \"761e2822-68b0-4ea8-ada6-80f8ce6dec21\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.283600 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v628j\" (UniqueName: \"kubernetes.io/projected/d566ef89-eb28-44ae-86ea-60a5a91803b0-kube-api-access-v628j\") pod \"machine-config-server-x9qcs\" (UID: \"d566ef89-eb28-44ae-86ea-60a5a91803b0\") " 
pod="openshift-machine-config-operator/machine-config-server-x9qcs" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.283632 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rv285\" (UniqueName: \"kubernetes.io/projected/fba67808-dc6d-4f9e-bd53-9185baa79d78-kube-api-access-rv285\") pod \"router-default-5444994796-bj5n6\" (UID: \"fba67808-dc6d-4f9e-bd53-9185baa79d78\") " pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.283652 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fba67808-dc6d-4f9e-bd53-9185baa79d78-service-ca-bundle\") pod \"router-default-5444994796-bj5n6\" (UID: \"fba67808-dc6d-4f9e-bd53-9185baa79d78\") " pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.283644 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/926402ae-efb3-46fa-b415-8333a236c36a-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-krtbj\" (UID: \"926402ae-efb3-46fa-b415-8333a236c36a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.283673 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f2txw\" (UniqueName: \"kubernetes.io/projected/926402ae-efb3-46fa-b415-8333a236c36a-kube-api-access-f2txw\") pod \"machine-config-controller-84d6567774-krtbj\" (UID: \"926402ae-efb3-46fa-b415-8333a236c36a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.283745 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config\" (UniqueName: \"kubernetes.io/configmap/63dd2ad3-2637-4d5d-99b8-255a37205e36-config\") pod \"service-ca-operator-777779d784-xz4gd\" (UID: \"63dd2ad3-2637-4d5d-99b8-255a37205e36\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.283784 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357560a9-5851-42f2-b627-a41d831d7f27-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mt2gd\" (UID: \"357560a9-5851-42f2-b627-a41d831d7f27\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.283810 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63dd2ad3-2637-4d5d-99b8-255a37205e36-serving-cert\") pod \"service-ca-operator-777779d784-xz4gd\" (UID: \"63dd2ad3-2637-4d5d-99b8-255a37205e36\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.283833 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b972d9ca-6117-4fd7-b488-8c4808f069d4-cert\") pod \"ingress-canary-524cv\" (UID: \"b972d9ca-6117-4fd7-b488-8c4808f069d4\") " pod="openshift-ingress-canary/ingress-canary-524cv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.283855 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t429x\" (UniqueName: \"kubernetes.io/projected/63dd2ad3-2637-4d5d-99b8-255a37205e36-kube-api-access-t429x\") pod \"service-ca-operator-777779d784-xz4gd\" (UID: \"63dd2ad3-2637-4d5d-99b8-255a37205e36\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd" Mar 19 10:25:46 crc 
kubenswrapper[4765]: I0319 10:25:46.283875 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bb55d3af-2526-425e-8a0a-25b779589866-srv-cert\") pod \"catalog-operator-68c6474976-gvklb\" (UID: \"bb55d3af-2526-425e-8a0a-25b779589866\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.283917 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gkhbf\" (UniqueName: \"kubernetes.io/projected/900ba5b7-85f6-4924-af0b-61efc2c8598a-kube-api-access-gkhbf\") pod \"dns-operator-744455d44c-qfvdv\" (UID: \"900ba5b7-85f6-4924-af0b-61efc2c8598a\") " pod="openshift-dns-operator/dns-operator-744455d44c-qfvdv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.283938 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/24b9c7be-22c4-4959-9332-d06229dd3371-metrics-tls\") pod \"ingress-operator-5b745b69d9-28fs8\" (UID: \"24b9c7be-22c4-4959-9332-d06229dd3371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.283980 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c7ffece-fbab-4d6c-a327-f0402649a29e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gg89j\" (UID: \"1c7ffece-fbab-4d6c-a327-f0402649a29e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284007 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1b4e46d6-a655-4649-9656-7c45ea94b38f-srv-cert\") pod 
\"olm-operator-6b444d44fb-2lqmb\" (UID: \"1b4e46d6-a655-4649-9656-7c45ea94b38f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284038 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357560a9-5851-42f2-b627-a41d831d7f27-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mt2gd\" (UID: \"357560a9-5851-42f2-b627-a41d831d7f27\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284061 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8vq9z\" (UniqueName: \"kubernetes.io/projected/bb55d3af-2526-425e-8a0a-25b779589866-kube-api-access-8vq9z\") pod \"catalog-operator-68c6474976-gvklb\" (UID: \"bb55d3af-2526-425e-8a0a-25b779589866\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284081 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fdxhv\" (UniqueName: \"kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-kube-api-access-fdxhv\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284101 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba67808-dc6d-4f9e-bd53-9185baa79d78-metrics-certs\") pod \"router-default-5444994796-bj5n6\" (UID: \"fba67808-dc6d-4f9e-bd53-9185baa79d78\") " pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284136 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc39b56a-2b78-4c24-9f99-e1357d76b391-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phgbf\" (UID: \"fc39b56a-2b78-4c24-9f99-e1357d76b391\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284157 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-csi-data-dir\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284175 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-secret-volume\") pod \"collect-profiles-29565255-jmv4v\" (UID: \"b8a97d83-18b0-42eb-9ed9-f49ffff3d034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284194 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/736dd712-8f96-4b9c-bf2b-2f3eb3d4a604-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-sw5kt\" (UID: \"736dd712-8f96-4b9c-bf2b-2f3eb3d4a604\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sw5kt" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284241 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/761e2822-68b0-4ea8-ada6-80f8ce6dec21-proxy-tls\") pod \"machine-config-operator-74547568cd-j8smv\" (UID: 
\"761e2822-68b0-4ea8-ada6-80f8ce6dec21\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284264 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54a73adb-452b-4db3-9bc2-3411d1575eb5-trusted-ca\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284353 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4ea5c445-7213-449a-84b3-94ef0ddad18e-tmpfs\") pod \"packageserver-d55dfcdfc-7jw7x\" (UID: \"4ea5c445-7213-449a-84b3-94ef0ddad18e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284385 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f27f5c72-19c7-4d66-b927-0eae532ff4fe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6wpdr\" (UID: \"f27f5c72-19c7-4d66-b927-0eae532ff4fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284405 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5xp6\" (UniqueName: \"kubernetes.io/projected/3a1f010c-6208-44bd-b36d-95140eaa0cd7-kube-api-access-f5xp6\") pod \"package-server-manager-789f6589d5-7xc9q\" (UID: \"3a1f010c-6208-44bd-b36d-95140eaa0cd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284431 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbh8j\" (UniqueName: \"kubernetes.io/projected/1b4e46d6-a655-4649-9656-7c45ea94b38f-kube-api-access-dbh8j\") pod \"olm-operator-6b444d44fb-2lqmb\" (UID: \"1b4e46d6-a655-4649-9656-7c45ea94b38f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284458 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8854m\" (UniqueName: \"kubernetes.io/projected/4980eaf1-2428-41fe-8a4f-052aace46947-kube-api-access-8854m\") pod \"downloads-7954f5f757-jnmk9\" (UID: \"4980eaf1-2428-41fe-8a4f-052aace46947\") " pod="openshift-console/downloads-7954f5f757-jnmk9" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284479 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fba67808-dc6d-4f9e-bd53-9185baa79d78-default-certificate\") pod \"router-default-5444994796-bj5n6\" (UID: \"fba67808-dc6d-4f9e-bd53-9185baa79d78\") " pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284500 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/761e2822-68b0-4ea8-ada6-80f8ce6dec21-images\") pod \"machine-config-operator-74547568cd-j8smv\" (UID: \"761e2822-68b0-4ea8-ada6-80f8ce6dec21\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284534 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: 
\"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284561 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc39b56a-2b78-4c24-9f99-e1357d76b391-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phgbf\" (UID: \"fc39b56a-2b78-4c24-9f99-e1357d76b391\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284598 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24b9c7be-22c4-4959-9332-d06229dd3371-trusted-ca\") pod \"ingress-operator-5b745b69d9-28fs8\" (UID: \"24b9c7be-22c4-4959-9332-d06229dd3371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284620 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vtzvz\" (UniqueName: \"kubernetes.io/projected/24b9c7be-22c4-4959-9332-d06229dd3371-kube-api-access-vtzvz\") pod \"ingress-operator-5b745b69d9-28fs8\" (UID: \"24b9c7be-22c4-4959-9332-d06229dd3371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284641 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dk4wh\" (UniqueName: \"kubernetes.io/projected/4ea5c445-7213-449a-84b3-94ef0ddad18e-kube-api-access-dk4wh\") pod \"packageserver-d55dfcdfc-7jw7x\" (UID: \"4ea5c445-7213-449a-84b3-94ef0ddad18e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284689 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-l2mcd\" (UniqueName: \"kubernetes.io/projected/23d53283-d76c-40c3-804e-73fc3431ed98-kube-api-access-l2mcd\") pod \"migrator-59844c95c7-5bjx5\" (UID: \"23d53283-d76c-40c3-804e-73fc3431ed98\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bjx5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.284709 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6b2566-55ba-4df7-be75-afdc03f5ea73-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zs4v4\" (UID: \"6f6b2566-55ba-4df7-be75-afdc03f5ea73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285223 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d566ef89-eb28-44ae-86ea-60a5a91803b0-certs\") pod \"machine-config-server-x9qcs\" (UID: \"d566ef89-eb28-44ae-86ea-60a5a91803b0\") " pod="openshift-machine-config-operator/machine-config-server-x9qcs" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285254 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7vpc\" (UniqueName: \"kubernetes.io/projected/b972d9ca-6117-4fd7-b488-8c4808f069d4-kube-api-access-s7vpc\") pod \"ingress-canary-524cv\" (UID: \"b972d9ca-6117-4fd7-b488-8c4808f069d4\") " pod="openshift-ingress-canary/ingress-canary-524cv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285281 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93df02a3-8614-47ff-a1ed-9592ef47d84e-config-volume\") pod \"dns-default-tk8m5\" (UID: \"93df02a3-8614-47ff-a1ed-9592ef47d84e\") " 
pod="openshift-dns/dns-default-tk8m5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285302 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjmk6\" (UniqueName: \"kubernetes.io/projected/93df02a3-8614-47ff-a1ed-9592ef47d84e-kube-api-access-gjmk6\") pod \"dns-default-tk8m5\" (UID: \"93df02a3-8614-47ff-a1ed-9592ef47d84e\") " pod="openshift-dns/dns-default-tk8m5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285338 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8q5v\" (UniqueName: \"kubernetes.io/projected/c5495eef-efca-4df2-81bb-bd93bb2f8a38-kube-api-access-g8q5v\") pod \"auto-csr-approver-29565264-nvg5v\" (UID: \"c5495eef-efca-4df2-81bb-bd93bb2f8a38\") " pod="openshift-infra/auto-csr-approver-29565264-nvg5v" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285361 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54a73adb-452b-4db3-9bc2-3411d1575eb5-registry-certificates\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285380 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24b9c7be-22c4-4959-9332-d06229dd3371-bound-sa-token\") pod \"ingress-operator-5b745b69d9-28fs8\" (UID: \"24b9c7be-22c4-4959-9332-d06229dd3371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285402 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mcpb\" (UniqueName: 
\"kubernetes.io/projected/f27f5c72-19c7-4d66-b927-0eae532ff4fe-kube-api-access-4mcpb\") pod \"marketplace-operator-79b997595-6wpdr\" (UID: \"f27f5c72-19c7-4d66-b927-0eae532ff4fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285450 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54a73adb-452b-4db3-9bc2-3411d1575eb5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285472 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d566ef89-eb28-44ae-86ea-60a5a91803b0-node-bootstrap-token\") pod \"machine-config-server-x9qcs\" (UID: \"d566ef89-eb28-44ae-86ea-60a5a91803b0\") " pod="openshift-machine-config-operator/machine-config-server-x9qcs" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285492 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a1f010c-6208-44bd-b36d-95140eaa0cd7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7xc9q\" (UID: \"3a1f010c-6208-44bd-b36d-95140eaa0cd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285516 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjlwb\" (UniqueName: \"kubernetes.io/projected/c6536bc9-dc65-4acd-975f-87f5621fb0f3-kube-api-access-zjlwb\") pod \"service-ca-9c57cc56f-h7csr\" (UID: \"c6536bc9-dc65-4acd-975f-87f5621fb0f3\") " 
pod="openshift-service-ca/service-ca-9c57cc56f-h7csr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285539 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djbdm\" (UniqueName: \"kubernetes.io/projected/19a81c88-752a-4a04-a1f7-70cb357f7be1-kube-api-access-djbdm\") pod \"cluster-image-registry-operator-dc59b4c8b-k7bsv\" (UID: \"19a81c88-752a-4a04-a1f7-70cb357f7be1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285558 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/357560a9-5851-42f2-b627-a41d831d7f27-config\") pod \"openshift-apiserver-operator-796bbdcf4f-mt2gd\" (UID: \"357560a9-5851-42f2-b627-a41d831d7f27\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285573 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdqns\" (UID: \"dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdqns" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285612 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5vd5\" (UniqueName: \"kubernetes.io/projected/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-kube-api-access-g5vd5\") pod \"collect-profiles-29565255-jmv4v\" (UID: \"b8a97d83-18b0-42eb-9ed9-f49ffff3d034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285677 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f27f5c72-19c7-4d66-b927-0eae532ff4fe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6wpdr\" (UID: \"f27f5c72-19c7-4d66-b927-0eae532ff4fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285699 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ea5c445-7213-449a-84b3-94ef0ddad18e-apiservice-cert\") pod \"packageserver-d55dfcdfc-7jw7x\" (UID: \"4ea5c445-7213-449a-84b3-94ef0ddad18e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285726 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vrsfr\" (UniqueName: \"kubernetes.io/projected/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-kube-api-access-vrsfr\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285762 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-config-volume\") pod \"collect-profiles-29565255-jmv4v\" (UID: \"b8a97d83-18b0-42eb-9ed9-f49ffff3d034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285784 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c7ffece-fbab-4d6c-a327-f0402649a29e-config\") pod \"kube-controller-manager-operator-78b949d7b-gg89j\" (UID: 
\"1c7ffece-fbab-4d6c-a327-f0402649a29e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285819 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19a81c88-752a-4a04-a1f7-70cb357f7be1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k7bsv\" (UID: \"19a81c88-752a-4a04-a1f7-70cb357f7be1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.285839 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5r7z\" (UniqueName: \"kubernetes.io/projected/6f6b2566-55ba-4df7-be75-afdc03f5ea73-kube-api-access-q5r7z\") pod \"kube-storage-version-migrator-operator-b67b599dd-zs4v4\" (UID: \"6f6b2566-55ba-4df7-be75-afdc03f5ea73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.287633 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54a73adb-452b-4db3-9bc2-3411d1575eb5-registry-certificates\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.287688 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/19a81c88-752a-4a04-a1f7-70cb357f7be1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k7bsv\" (UID: \"19a81c88-752a-4a04-a1f7-70cb357f7be1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" Mar 19 10:25:46 crc 
kubenswrapper[4765]: I0319 10:25:46.287712 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19a81c88-752a-4a04-a1f7-70cb357f7be1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k7bsv\" (UID: \"19a81c88-752a-4a04-a1f7-70cb357f7be1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.287735 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6b2566-55ba-4df7-be75-afdc03f5ea73-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zs4v4\" (UID: \"6f6b2566-55ba-4df7-be75-afdc03f5ea73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4" Mar 19 10:25:46 crc kubenswrapper[4765]: E0319 10:25:46.288187 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:46.788163939 +0000 UTC m=+245.137109481 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.288760 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-mountpoint-dir\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.288895 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1b4e46d6-a655-4649-9656-7c45ea94b38f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2lqmb\" (UID: \"1b4e46d6-a655-4649-9656-7c45ea94b38f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.288925 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc39b56a-2b78-4c24-9f99-e1357d76b391-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phgbf\" (UID: \"fc39b56a-2b78-4c24-9f99-e1357d76b391\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.289019 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-bound-sa-token\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.289754 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54a73adb-452b-4db3-9bc2-3411d1575eb5-trusted-ca\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.290742 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/926402ae-efb3-46fa-b415-8333a236c36a-proxy-tls\") pod \"machine-config-controller-84d6567774-krtbj\" (UID: \"926402ae-efb3-46fa-b415-8333a236c36a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.290965 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/900ba5b7-85f6-4924-af0b-61efc2c8598a-metrics-tls\") pod \"dns-operator-744455d44c-qfvdv\" (UID: \"900ba5b7-85f6-4924-af0b-61efc2c8598a\") " pod="openshift-dns-operator/dns-operator-744455d44c-qfvdv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.291106 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-registry-tls\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.291260 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/24b9c7be-22c4-4959-9332-d06229dd3371-metrics-tls\") pod \"ingress-operator-5b745b69d9-28fs8\" (UID: \"24b9c7be-22c4-4959-9332-d06229dd3371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.291115 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/357560a9-5851-42f2-b627-a41d831d7f27-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-mt2gd\" (UID: \"357560a9-5851-42f2-b627-a41d831d7f27\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.291590 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54a73adb-452b-4db3-9bc2-3411d1575eb5-installation-pull-secrets\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.292105 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/24b9c7be-22c4-4959-9332-d06229dd3371-trusted-ca\") pod \"ingress-operator-5b745b69d9-28fs8\" (UID: \"24b9c7be-22c4-4959-9332-d06229dd3371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.292761 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/19a81c88-752a-4a04-a1f7-70cb357f7be1-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-k7bsv\" (UID: \"19a81c88-752a-4a04-a1f7-70cb357f7be1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.298741 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/19a81c88-752a-4a04-a1f7-70cb357f7be1-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-k7bsv\" (UID: \"19a81c88-752a-4a04-a1f7-70cb357f7be1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.342853 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2x8h\" (UniqueName: \"kubernetes.io/projected/357560a9-5851-42f2-b627-a41d831d7f27-kube-api-access-t2x8h\") pod \"openshift-apiserver-operator-796bbdcf4f-mt2gd\" (UID: \"357560a9-5851-42f2-b627-a41d831d7f27\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.347596 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.355866 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f2txw\" (UniqueName: \"kubernetes.io/projected/926402ae-efb3-46fa-b415-8333a236c36a-kube-api-access-f2txw\") pod \"machine-config-controller-84d6567774-krtbj\" (UID: \"926402ae-efb3-46fa-b415-8333a236c36a\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.381709 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gkhbf\" (UniqueName: \"kubernetes.io/projected/900ba5b7-85f6-4924-af0b-61efc2c8598a-kube-api-access-gkhbf\") pod \"dns-operator-744455d44c-qfvdv\" (UID: \"900ba5b7-85f6-4924-af0b-61efc2c8598a\") " pod="openshift-dns-operator/dns-operator-744455d44c-qfvdv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393182 4765 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393357 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t429x\" (UniqueName: \"kubernetes.io/projected/63dd2ad3-2637-4d5d-99b8-255a37205e36-kube-api-access-t429x\") pod \"service-ca-operator-777779d784-xz4gd\" (UID: \"63dd2ad3-2637-4d5d-99b8-255a37205e36\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393381 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bb55d3af-2526-425e-8a0a-25b779589866-srv-cert\") pod \"catalog-operator-68c6474976-gvklb\" (UID: \"bb55d3af-2526-425e-8a0a-25b779589866\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393400 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c7ffece-fbab-4d6c-a327-f0402649a29e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gg89j\" (UID: \"1c7ffece-fbab-4d6c-a327-f0402649a29e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393420 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8vq9z\" (UniqueName: \"kubernetes.io/projected/bb55d3af-2526-425e-8a0a-25b779589866-kube-api-access-8vq9z\") pod \"catalog-operator-68c6474976-gvklb\" (UID: 
\"bb55d3af-2526-425e-8a0a-25b779589866\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393436 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1b4e46d6-a655-4649-9656-7c45ea94b38f-srv-cert\") pod \"olm-operator-6b444d44fb-2lqmb\" (UID: \"1b4e46d6-a655-4649-9656-7c45ea94b38f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393457 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba67808-dc6d-4f9e-bd53-9185baa79d78-metrics-certs\") pod \"router-default-5444994796-bj5n6\" (UID: \"fba67808-dc6d-4f9e-bd53-9185baa79d78\") " pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393493 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc39b56a-2b78-4c24-9f99-e1357d76b391-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phgbf\" (UID: \"fc39b56a-2b78-4c24-9f99-e1357d76b391\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393514 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/736dd712-8f96-4b9c-bf2b-2f3eb3d4a604-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-sw5kt\" (UID: \"736dd712-8f96-4b9c-bf2b-2f3eb3d4a604\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sw5kt" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393533 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-csi-data-dir\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393550 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-secret-volume\") pod \"collect-profiles-29565255-jmv4v\" (UID: \"b8a97d83-18b0-42eb-9ed9-f49ffff3d034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393568 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/761e2822-68b0-4ea8-ada6-80f8ce6dec21-proxy-tls\") pod \"machine-config-operator-74547568cd-j8smv\" (UID: \"761e2822-68b0-4ea8-ada6-80f8ce6dec21\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393610 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4ea5c445-7213-449a-84b3-94ef0ddad18e-tmpfs\") pod \"packageserver-d55dfcdfc-7jw7x\" (UID: \"4ea5c445-7213-449a-84b3-94ef0ddad18e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393640 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbh8j\" (UniqueName: \"kubernetes.io/projected/1b4e46d6-a655-4649-9656-7c45ea94b38f-kube-api-access-dbh8j\") pod \"olm-operator-6b444d44fb-2lqmb\" (UID: \"1b4e46d6-a655-4649-9656-7c45ea94b38f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393669 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f27f5c72-19c7-4d66-b927-0eae532ff4fe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6wpdr\" (UID: \"f27f5c72-19c7-4d66-b927-0eae532ff4fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393692 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5xp6\" (UniqueName: \"kubernetes.io/projected/3a1f010c-6208-44bd-b36d-95140eaa0cd7-kube-api-access-f5xp6\") pod \"package-server-manager-789f6589d5-7xc9q\" (UID: \"3a1f010c-6208-44bd-b36d-95140eaa0cd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393713 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fba67808-dc6d-4f9e-bd53-9185baa79d78-default-certificate\") pod \"router-default-5444994796-bj5n6\" (UID: \"fba67808-dc6d-4f9e-bd53-9185baa79d78\") " pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393735 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/761e2822-68b0-4ea8-ada6-80f8ce6dec21-images\") pod \"machine-config-operator-74547568cd-j8smv\" (UID: \"761e2822-68b0-4ea8-ada6-80f8ce6dec21\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393761 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc39b56a-2b78-4c24-9f99-e1357d76b391-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phgbf\" (UID: \"fc39b56a-2b78-4c24-9f99-e1357d76b391\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393781 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dk4wh\" (UniqueName: \"kubernetes.io/projected/4ea5c445-7213-449a-84b3-94ef0ddad18e-kube-api-access-dk4wh\") pod \"packageserver-d55dfcdfc-7jw7x\" (UID: \"4ea5c445-7213-449a-84b3-94ef0ddad18e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393803 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2mcd\" (UniqueName: \"kubernetes.io/projected/23d53283-d76c-40c3-804e-73fc3431ed98-kube-api-access-l2mcd\") pod \"migrator-59844c95c7-5bjx5\" (UID: \"23d53283-d76c-40c3-804e-73fc3431ed98\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bjx5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393821 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d566ef89-eb28-44ae-86ea-60a5a91803b0-certs\") pod \"machine-config-server-x9qcs\" (UID: \"d566ef89-eb28-44ae-86ea-60a5a91803b0\") " pod="openshift-machine-config-operator/machine-config-server-x9qcs" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393836 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6b2566-55ba-4df7-be75-afdc03f5ea73-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zs4v4\" (UID: \"6f6b2566-55ba-4df7-be75-afdc03f5ea73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393853 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/93df02a3-8614-47ff-a1ed-9592ef47d84e-config-volume\") pod \"dns-default-tk8m5\" (UID: \"93df02a3-8614-47ff-a1ed-9592ef47d84e\") " pod="openshift-dns/dns-default-tk8m5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393869 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gjmk6\" (UniqueName: \"kubernetes.io/projected/93df02a3-8614-47ff-a1ed-9592ef47d84e-kube-api-access-gjmk6\") pod \"dns-default-tk8m5\" (UID: \"93df02a3-8614-47ff-a1ed-9592ef47d84e\") " pod="openshift-dns/dns-default-tk8m5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393883 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s7vpc\" (UniqueName: \"kubernetes.io/projected/b972d9ca-6117-4fd7-b488-8c4808f069d4-kube-api-access-s7vpc\") pod \"ingress-canary-524cv\" (UID: \"b972d9ca-6117-4fd7-b488-8c4808f069d4\") " pod="openshift-ingress-canary/ingress-canary-524cv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393901 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8q5v\" (UniqueName: \"kubernetes.io/projected/c5495eef-efca-4df2-81bb-bd93bb2f8a38-kube-api-access-g8q5v\") pod \"auto-csr-approver-29565264-nvg5v\" (UID: \"c5495eef-efca-4df2-81bb-bd93bb2f8a38\") " pod="openshift-infra/auto-csr-approver-29565264-nvg5v" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393918 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mcpb\" (UniqueName: \"kubernetes.io/projected/f27f5c72-19c7-4d66-b927-0eae532ff4fe-kube-api-access-4mcpb\") pod \"marketplace-operator-79b997595-6wpdr\" (UID: \"f27f5c72-19c7-4d66-b927-0eae532ff4fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393942 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" 
(UniqueName: \"kubernetes.io/secret/d566ef89-eb28-44ae-86ea-60a5a91803b0-node-bootstrap-token\") pod \"machine-config-server-x9qcs\" (UID: \"d566ef89-eb28-44ae-86ea-60a5a91803b0\") " pod="openshift-machine-config-operator/machine-config-server-x9qcs" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.393981 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a1f010c-6208-44bd-b36d-95140eaa0cd7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7xc9q\" (UID: \"3a1f010c-6208-44bd-b36d-95140eaa0cd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394008 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zjlwb\" (UniqueName: \"kubernetes.io/projected/c6536bc9-dc65-4acd-975f-87f5621fb0f3-kube-api-access-zjlwb\") pod \"service-ca-9c57cc56f-h7csr\" (UID: \"c6536bc9-dc65-4acd-975f-87f5621fb0f3\") " pod="openshift-service-ca/service-ca-9c57cc56f-h7csr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394038 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5vd5\" (UniqueName: \"kubernetes.io/projected/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-kube-api-access-g5vd5\") pod \"collect-profiles-29565255-jmv4v\" (UID: \"b8a97d83-18b0-42eb-9ed9-f49ffff3d034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394055 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdqns\" (UID: \"dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0\") " 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdqns" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394073 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f27f5c72-19c7-4d66-b927-0eae532ff4fe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6wpdr\" (UID: \"f27f5c72-19c7-4d66-b927-0eae532ff4fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394087 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ea5c445-7213-449a-84b3-94ef0ddad18e-apiservice-cert\") pod \"packageserver-d55dfcdfc-7jw7x\" (UID: \"4ea5c445-7213-449a-84b3-94ef0ddad18e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394105 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vrsfr\" (UniqueName: \"kubernetes.io/projected/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-kube-api-access-vrsfr\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394133 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-config-volume\") pod \"collect-profiles-29565255-jmv4v\" (UID: \"b8a97d83-18b0-42eb-9ed9-f49ffff3d034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394149 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/1c7ffece-fbab-4d6c-a327-f0402649a29e-config\") pod \"kube-controller-manager-operator-78b949d7b-gg89j\" (UID: \"1c7ffece-fbab-4d6c-a327-f0402649a29e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394165 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5r7z\" (UniqueName: \"kubernetes.io/projected/6f6b2566-55ba-4df7-be75-afdc03f5ea73-kube-api-access-q5r7z\") pod \"kube-storage-version-migrator-operator-b67b599dd-zs4v4\" (UID: \"6f6b2566-55ba-4df7-be75-afdc03f5ea73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394183 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6b2566-55ba-4df7-be75-afdc03f5ea73-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zs4v4\" (UID: \"6f6b2566-55ba-4df7-be75-afdc03f5ea73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394203 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-mountpoint-dir\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394224 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1b4e46d6-a655-4649-9656-7c45ea94b38f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2lqmb\" (UID: \"1b4e46d6-a655-4649-9656-7c45ea94b38f\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394241 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc39b56a-2b78-4c24-9f99-e1357d76b391-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phgbf\" (UID: \"fc39b56a-2b78-4c24-9f99-e1357d76b391\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394263 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c7ffece-fbab-4d6c-a327-f0402649a29e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gg89j\" (UID: \"1c7ffece-fbab-4d6c-a327-f0402649a29e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394286 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-plugins-dir\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394303 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c6536bc9-dc65-4acd-975f-87f5621fb0f3-signing-key\") pod \"service-ca-9c57cc56f-h7csr\" (UID: \"c6536bc9-dc65-4acd-975f-87f5621fb0f3\") " pod="openshift-service-ca/service-ca-9c57cc56f-h7csr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394320 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: 
\"kubernetes.io/secret/fba67808-dc6d-4f9e-bd53-9185baa79d78-stats-auth\") pod \"router-default-5444994796-bj5n6\" (UID: \"fba67808-dc6d-4f9e-bd53-9185baa79d78\") " pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394337 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kp96v\" (UniqueName: \"kubernetes.io/projected/736dd712-8f96-4b9c-bf2b-2f3eb3d4a604-kube-api-access-kp96v\") pod \"multus-admission-controller-857f4d67dd-sw5kt\" (UID: \"736dd712-8f96-4b9c-bf2b-2f3eb3d4a604\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sw5kt" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394353 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93df02a3-8614-47ff-a1ed-9592ef47d84e-metrics-tls\") pod \"dns-default-tk8m5\" (UID: \"93df02a3-8614-47ff-a1ed-9592ef47d84e\") " pod="openshift-dns/dns-default-tk8m5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394381 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c6536bc9-dc65-4acd-975f-87f5621fb0f3-signing-cabundle\") pod \"service-ca-9c57cc56f-h7csr\" (UID: \"c6536bc9-dc65-4acd-975f-87f5621fb0f3\") " pod="openshift-service-ca/service-ca-9c57cc56f-h7csr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394403 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ea5c445-7213-449a-84b3-94ef0ddad18e-webhook-cert\") pod \"packageserver-d55dfcdfc-7jw7x\" (UID: \"4ea5c445-7213-449a-84b3-94ef0ddad18e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394420 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bb55d3af-2526-425e-8a0a-25b779589866-profile-collector-cert\") pod \"catalog-operator-68c6474976-gvklb\" (UID: \"bb55d3af-2526-425e-8a0a-25b779589866\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394439 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q6hhf\" (UniqueName: \"kubernetes.io/projected/dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0-kube-api-access-q6hhf\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdqns\" (UID: \"dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdqns" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394464 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k979c\" (UniqueName: \"kubernetes.io/projected/761e2822-68b0-4ea8-ada6-80f8ce6dec21-kube-api-access-k979c\") pod \"machine-config-operator-74547568cd-j8smv\" (UID: \"761e2822-68b0-4ea8-ada6-80f8ce6dec21\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394482 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-socket-dir\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394497 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-registration-dir\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 
10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394516 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/761e2822-68b0-4ea8-ada6-80f8ce6dec21-auth-proxy-config\") pod \"machine-config-operator-74547568cd-j8smv\" (UID: \"761e2822-68b0-4ea8-ada6-80f8ce6dec21\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394536 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v628j\" (UniqueName: \"kubernetes.io/projected/d566ef89-eb28-44ae-86ea-60a5a91803b0-kube-api-access-v628j\") pod \"machine-config-server-x9qcs\" (UID: \"d566ef89-eb28-44ae-86ea-60a5a91803b0\") " pod="openshift-machine-config-operator/machine-config-server-x9qcs" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394554 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rv285\" (UniqueName: \"kubernetes.io/projected/fba67808-dc6d-4f9e-bd53-9185baa79d78-kube-api-access-rv285\") pod \"router-default-5444994796-bj5n6\" (UID: \"fba67808-dc6d-4f9e-bd53-9185baa79d78\") " pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394571 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fba67808-dc6d-4f9e-bd53-9185baa79d78-service-ca-bundle\") pod \"router-default-5444994796-bj5n6\" (UID: \"fba67808-dc6d-4f9e-bd53-9185baa79d78\") " pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394587 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63dd2ad3-2637-4d5d-99b8-255a37205e36-config\") pod \"service-ca-operator-777779d784-xz4gd\" (UID: 
\"63dd2ad3-2637-4d5d-99b8-255a37205e36\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394604 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b972d9ca-6117-4fd7-b488-8c4808f069d4-cert\") pod \"ingress-canary-524cv\" (UID: \"b972d9ca-6117-4fd7-b488-8c4808f069d4\") " pod="openshift-ingress-canary/ingress-canary-524cv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.394625 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63dd2ad3-2637-4d5d-99b8-255a37205e36-serving-cert\") pod \"service-ca-operator-777779d784-xz4gd\" (UID: \"63dd2ad3-2637-4d5d-99b8-255a37205e36\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd" Mar 19 10:25:46 crc kubenswrapper[4765]: E0319 10:25:46.394934 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:46.894908058 +0000 UTC m=+245.243853600 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.395058 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-csi-data-dir\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.395767 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/761e2822-68b0-4ea8-ada6-80f8ce6dec21-images\") pod \"machine-config-operator-74547568cd-j8smv\" (UID: \"761e2822-68b0-4ea8-ada6-80f8ce6dec21\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.396335 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc39b56a-2b78-4c24-9f99-e1357d76b391-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phgbf\" (UID: \"fc39b56a-2b78-4c24-9f99-e1357d76b391\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.397997 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f27f5c72-19c7-4d66-b927-0eae532ff4fe-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6wpdr\" (UID: 
\"f27f5c72-19c7-4d66-b927-0eae532ff4fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.399341 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/4ea5c445-7213-449a-84b3-94ef0ddad18e-tmpfs\") pod \"packageserver-d55dfcdfc-7jw7x\" (UID: \"4ea5c445-7213-449a-84b3-94ef0ddad18e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.399358 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6f6b2566-55ba-4df7-be75-afdc03f5ea73-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-zs4v4\" (UID: \"6f6b2566-55ba-4df7-be75-afdc03f5ea73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.401474 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-mountpoint-dir\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.401702 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-socket-dir\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.402626 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w"] Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.402707 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-registration-dir\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.403473 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-plugins-dir\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.407187 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/1b4e46d6-a655-4649-9656-7c45ea94b38f-srv-cert\") pod \"olm-operator-6b444d44fb-2lqmb\" (UID: \"1b4e46d6-a655-4649-9656-7c45ea94b38f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.408477 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/d566ef89-eb28-44ae-86ea-60a5a91803b0-certs\") pod \"machine-config-server-x9qcs\" (UID: \"d566ef89-eb28-44ae-86ea-60a5a91803b0\") " pod="openshift-machine-config-operator/machine-config-server-x9qcs" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.409279 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/d566ef89-eb28-44ae-86ea-60a5a91803b0-node-bootstrap-token\") pod \"machine-config-server-x9qcs\" (UID: \"d566ef89-eb28-44ae-86ea-60a5a91803b0\") " pod="openshift-machine-config-operator/machine-config-server-x9qcs" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.411687 4765 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3a1f010c-6208-44bd-b36d-95140eaa0cd7-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-7xc9q\" (UID: \"3a1f010c-6208-44bd-b36d-95140eaa0cd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.413397 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/c6536bc9-dc65-4acd-975f-87f5621fb0f3-signing-cabundle\") pod \"service-ca-9c57cc56f-h7csr\" (UID: \"c6536bc9-dc65-4acd-975f-87f5621fb0f3\") " pod="openshift-service-ca/service-ca-9c57cc56f-h7csr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.416633 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/c6536bc9-dc65-4acd-975f-87f5621fb0f3-signing-key\") pod \"service-ca-9c57cc56f-h7csr\" (UID: \"c6536bc9-dc65-4acd-975f-87f5621fb0f3\") " pod="openshift-service-ca/service-ca-9c57cc56f-h7csr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.418263 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1c7ffece-fbab-4d6c-a327-f0402649a29e-config\") pod \"kube-controller-manager-operator-78b949d7b-gg89j\" (UID: \"1c7ffece-fbab-4d6c-a327-f0402649a29e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.422601 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/736dd712-8f96-4b9c-bf2b-2f3eb3d4a604-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-sw5kt\" (UID: \"736dd712-8f96-4b9c-bf2b-2f3eb3d4a604\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sw5kt" 
Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.424175 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/4ea5c445-7213-449a-84b3-94ef0ddad18e-webhook-cert\") pod \"packageserver-d55dfcdfc-7jw7x\" (UID: \"4ea5c445-7213-449a-84b3-94ef0ddad18e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.424298 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/63dd2ad3-2637-4d5d-99b8-255a37205e36-config\") pod \"service-ca-operator-777779d784-xz4gd\" (UID: \"63dd2ad3-2637-4d5d-99b8-255a37205e36\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.424610 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/4ea5c445-7213-449a-84b3-94ef0ddad18e-apiservice-cert\") pod \"packageserver-d55dfcdfc-7jw7x\" (UID: \"4ea5c445-7213-449a-84b3-94ef0ddad18e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.428473 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/bb55d3af-2526-425e-8a0a-25b779589866-srv-cert\") pod \"catalog-operator-68c6474976-gvklb\" (UID: \"bb55d3af-2526-425e-8a0a-25b779589866\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.430855 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6f6b2566-55ba-4df7-be75-afdc03f5ea73-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-zs4v4\" (UID: \"6f6b2566-55ba-4df7-be75-afdc03f5ea73\") " 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.428642 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.431001 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/93df02a3-8614-47ff-a1ed-9592ef47d84e-metrics-tls\") pod \"dns-default-tk8m5\" (UID: \"93df02a3-8614-47ff-a1ed-9592ef47d84e\") " pod="openshift-dns/dns-default-tk8m5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.431453 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/bb55d3af-2526-425e-8a0a-25b779589866-profile-collector-cert\") pod \"catalog-operator-68c6474976-gvklb\" (UID: \"bb55d3af-2526-425e-8a0a-25b779589866\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.431793 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/63dd2ad3-2637-4d5d-99b8-255a37205e36-serving-cert\") pod \"service-ca-operator-777779d784-xz4gd\" (UID: \"63dd2ad3-2637-4d5d-99b8-255a37205e36\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.432402 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1c7ffece-fbab-4d6c-a327-f0402649a29e-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-gg89j\" (UID: \"1c7ffece-fbab-4d6c-a327-f0402649a29e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j" Mar 19 10:25:46 crc 
kubenswrapper[4765]: I0319 10:25:46.432627 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b972d9ca-6117-4fd7-b488-8c4808f069d4-cert\") pod \"ingress-canary-524cv\" (UID: \"b972d9ca-6117-4fd7-b488-8c4808f069d4\") " pod="openshift-ingress-canary/ingress-canary-524cv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.432719 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/1b4e46d6-a655-4649-9656-7c45ea94b38f-profile-collector-cert\") pod \"olm-operator-6b444d44fb-2lqmb\" (UID: \"1b4e46d6-a655-4649-9656-7c45ea94b38f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.433092 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f27f5c72-19c7-4d66-b927-0eae532ff4fe-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6wpdr\" (UID: \"f27f5c72-19c7-4d66-b927-0eae532ff4fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.433860 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fdxhv\" (UniqueName: \"kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-kube-api-access-fdxhv\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.435426 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/93df02a3-8614-47ff-a1ed-9592ef47d84e-config-volume\") pod \"dns-default-tk8m5\" (UID: \"93df02a3-8614-47ff-a1ed-9592ef47d84e\") " pod="openshift-dns/dns-default-tk8m5" Mar 19 10:25:46 crc 
kubenswrapper[4765]: I0319 10:25:46.435800 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/761e2822-68b0-4ea8-ada6-80f8ce6dec21-proxy-tls\") pod \"machine-config-operator-74547568cd-j8smv\" (UID: \"761e2822-68b0-4ea8-ada6-80f8ce6dec21\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.436283 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/19a81c88-752a-4a04-a1f7-70cb357f7be1-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-k7bsv\" (UID: \"19a81c88-752a-4a04-a1f7-70cb357f7be1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.436583 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-secret-volume\") pod \"collect-profiles-29565255-jmv4v\" (UID: \"b8a97d83-18b0-42eb-9ed9-f49ffff3d034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.436717 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/fba67808-dc6d-4f9e-bd53-9185baa79d78-default-certificate\") pod \"router-default-5444994796-bj5n6\" (UID: \"fba67808-dc6d-4f9e-bd53-9185baa79d78\") " pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.436953 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc39b56a-2b78-4c24-9f99-e1357d76b391-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phgbf\" (UID: \"fc39b56a-2b78-4c24-9f99-e1357d76b391\") " 
pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.438135 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdqns\" (UID: \"dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdqns" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.438220 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-config-volume\") pod \"collect-profiles-29565255-jmv4v\" (UID: \"b8a97d83-18b0-42eb-9ed9-f49ffff3d034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.439236 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/fba67808-dc6d-4f9e-bd53-9185baa79d78-metrics-certs\") pod \"router-default-5444994796-bj5n6\" (UID: \"fba67808-dc6d-4f9e-bd53-9185baa79d78\") " pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.440777 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vtzvz\" (UniqueName: \"kubernetes.io/projected/24b9c7be-22c4-4959-9332-d06229dd3371-kube-api-access-vtzvz\") pod \"ingress-operator-5b745b69d9-28fs8\" (UID: \"24b9c7be-22c4-4959-9332-d06229dd3371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.444756 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fba67808-dc6d-4f9e-bd53-9185baa79d78-service-ca-bundle\") pod \"router-default-5444994796-bj5n6\" (UID: \"fba67808-dc6d-4f9e-bd53-9185baa79d78\") " pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.444773 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/761e2822-68b0-4ea8-ada6-80f8ce6dec21-auth-proxy-config\") pod \"machine-config-operator-74547568cd-j8smv\" (UID: \"761e2822-68b0-4ea8-ada6-80f8ce6dec21\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.449764 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/fba67808-dc6d-4f9e-bd53-9185baa79d78-stats-auth\") pod \"router-default-5444994796-bj5n6\" (UID: \"fba67808-dc6d-4f9e-bd53-9185baa79d78\") " pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.458836 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8854m\" (UniqueName: \"kubernetes.io/projected/4980eaf1-2428-41fe-8a4f-052aace46947-kube-api-access-8854m\") pod \"downloads-7954f5f757-jnmk9\" (UID: \"4980eaf1-2428-41fe-8a4f-052aace46947\") " pod="openshift-console/downloads-7954f5f757-jnmk9" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.476439 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/24b9c7be-22c4-4959-9332-d06229dd3371-bound-sa-token\") pod \"ingress-operator-5b745b69d9-28fs8\" (UID: \"24b9c7be-22c4-4959-9332-d06229dd3371\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.495905 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: E0319 10:25:46.496269 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:46.99625326 +0000 UTC m=+245.345198802 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.504261 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djbdm\" (UniqueName: \"kubernetes.io/projected/19a81c88-752a-4a04-a1f7-70cb357f7be1-kube-api-access-djbdm\") pod \"cluster-image-registry-operator-dc59b4c8b-k7bsv\" (UID: \"19a81c88-752a-4a04-a1f7-70cb357f7be1\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.509070 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r9spg"] Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.515027 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-bound-sa-token\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: W0319 10:25:46.535209 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72f5edb0_c000_4e80_b27d_d0d6023510f8.slice/crio-1620f78379a6434dd79ac5ac6c0de634083e8f3ccf4bf8e7ca5a12738d5e41b3 WatchSource:0}: Error finding container 1620f78379a6434dd79ac5ac6c0de634083e8f3ccf4bf8e7ca5a12738d5e41b3: Status 404 returned error can't find the container with id 1620f78379a6434dd79ac5ac6c0de634083e8f3ccf4bf8e7ca5a12738d5e41b3 Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.555758 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t429x\" (UniqueName: \"kubernetes.io/projected/63dd2ad3-2637-4d5d-99b8-255a37205e36-kube-api-access-t429x\") pod \"service-ca-operator-777779d784-xz4gd\" (UID: \"63dd2ad3-2637-4d5d-99b8-255a37205e36\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.580352 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbh8j\" (UniqueName: \"kubernetes.io/projected/1b4e46d6-a655-4649-9656-7c45ea94b38f-kube-api-access-dbh8j\") pod \"olm-operator-6b444d44fb-2lqmb\" (UID: \"1b4e46d6-a655-4649-9656-7c45ea94b38f\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.597012 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:46 crc kubenswrapper[4765]: E0319 10:25:46.597409 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:47.097368086 +0000 UTC m=+245.446313618 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.597583 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.598134 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjmk6\" (UniqueName: \"kubernetes.io/projected/93df02a3-8614-47ff-a1ed-9592ef47d84e-kube-api-access-gjmk6\") pod \"dns-default-tk8m5\" (UID: \"93df02a3-8614-47ff-a1ed-9592ef47d84e\") " pod="openshift-dns/dns-default-tk8m5" Mar 19 10:25:46 crc kubenswrapper[4765]: E0319 10:25:46.598305 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 10:25:47.098296881 +0000 UTC m=+245.447242423 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.629762 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.632951 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1c7ffece-fbab-4d6c-a327-f0402649a29e-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-gg89j\" (UID: \"1c7ffece-fbab-4d6c-a327-f0402649a29e\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.638908 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2mcd\" (UniqueName: \"kubernetes.io/projected/23d53283-d76c-40c3-804e-73fc3431ed98-kube-api-access-l2mcd\") pod \"migrator-59844c95c7-5bjx5\" (UID: \"23d53283-d76c-40c3-804e-73fc3431ed98\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bjx5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.639737 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-qfvdv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.643658 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-tk8m5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.645518 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8sk6m"] Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.654977 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs"] Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.657723 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-xdm6r"] Mar 19 10:25:46 crc kubenswrapper[4765]: W0319 10:25:46.660764 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod331c5a49_dffb_4c14_ab1b_1b41bfd8f09f.slice/crio-d8a08ad87358385224b7c93c14fb6898124d997385218d2feb739bce98dfe991 WatchSource:0}: Error finding container d8a08ad87358385224b7c93c14fb6898124d997385218d2feb739bce98dfe991: Status 404 returned error can't find the container with id d8a08ad87358385224b7c93c14fb6898124d997385218d2feb739bce98dfe991 Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.662679 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh"] Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.665095 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dk4wh\" (UniqueName: \"kubernetes.io/projected/4ea5c445-7213-449a-84b3-94ef0ddad18e-kube-api-access-dk4wh\") pod \"packageserver-d55dfcdfc-7jw7x\" (UID: \"4ea5c445-7213-449a-84b3-94ef0ddad18e\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.691474 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5xp6\" (UniqueName: 
\"kubernetes.io/projected/3a1f010c-6208-44bd-b36d-95140eaa0cd7-kube-api-access-f5xp6\") pod \"package-server-manager-789f6589d5-7xc9q\" (UID: \"3a1f010c-6208-44bd-b36d-95140eaa0cd7\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.699484 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kp96v\" (UniqueName: \"kubernetes.io/projected/736dd712-8f96-4b9c-bf2b-2f3eb3d4a604-kube-api-access-kp96v\") pod \"multus-admission-controller-857f4d67dd-sw5kt\" (UID: \"736dd712-8f96-4b9c-bf2b-2f3eb3d4a604\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-sw5kt" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.700725 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:46 crc kubenswrapper[4765]: E0319 10:25:46.700899 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:47.200877297 +0000 UTC m=+245.549822839 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.701173 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: E0319 10:25:46.701569 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:47.201561386 +0000 UTC m=+245.550506928 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.705267 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/downloads-7954f5f757-jnmk9" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.710111 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd"] Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.714888 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" event={"ID":"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f","Type":"ContainerStarted","Data":"d8a08ad87358385224b7c93c14fb6898124d997385218d2feb739bce98dfe991"} Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.729632 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" event={"ID":"7b848589-8f09-4ff0-b6cc-1e92be8c5c80","Type":"ContainerStarted","Data":"f38159f53a328f647161dc86fe221ff09afd0d13de8cd1c60cb92b1a6d712eb0"} Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.729673 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" event={"ID":"7b848589-8f09-4ff0-b6cc-1e92be8c5c80","Type":"ContainerStarted","Data":"0e9ae14d5d9916585ecc1c92cd6b7c4759fdc42590c2e564f6845d7850275328"} Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.729688 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.732926 4765 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-ptr8w container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" start-of-body= Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.733023 4765 prober.go:107] "Probe 
failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" podUID="7b848589-8f09-4ff0-b6cc-1e92be8c5c80" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.7:8443/healthz\": dial tcp 10.217.0.7:8443: connect: connection refused" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.733448 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn" event={"ID":"b0fd12d6-f32c-4f69-a285-8f837e745910","Type":"ContainerStarted","Data":"f74fa38835eff2fa85b7cf55e97b7ab0f7816fdd4791122f9125a14ce9b9c725"} Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.733517 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn" event={"ID":"b0fd12d6-f32c-4f69-a285-8f837e745910","Type":"ContainerStarted","Data":"c83dd0ab40f35f9428d03ad0840362305721c97f823ecb127c4c169e5dfd25df"} Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.754849 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8q5v\" (UniqueName: \"kubernetes.io/projected/c5495eef-efca-4df2-81bb-bd93bb2f8a38-kube-api-access-g8q5v\") pod \"auto-csr-approver-29565264-nvg5v\" (UID: \"c5495eef-efca-4df2-81bb-bd93bb2f8a38\") " pod="openshift-infra/auto-csr-approver-29565264-nvg5v" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.755915 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.756393 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7vpc\" (UniqueName: \"kubernetes.io/projected/b972d9ca-6117-4fd7-b488-8c4808f069d4-kube-api-access-s7vpc\") pod \"ingress-canary-524cv\" (UID: \"b972d9ca-6117-4fd7-b488-8c4808f069d4\") " pod="openshift-ingress-canary/ingress-canary-524cv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.764258 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" event={"ID":"396daff4-daaf-43de-8794-5076381c0d47","Type":"ContainerStarted","Data":"e20951d56abf9d0fbd21cc611ad0a3da1ee699b53c75ef31300643986a9f8743"} Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.764405 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.764910 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts"] Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.766095 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjlwb\" (UniqueName: \"kubernetes.io/projected/c6536bc9-dc65-4acd-975f-87f5621fb0f3-kube-api-access-zjlwb\") pod \"service-ca-9c57cc56f-h7csr\" (UID: \"c6536bc9-dc65-4acd-975f-87f5621fb0f3\") " pod="openshift-service-ca/service-ca-9c57cc56f-h7csr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.772347 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bjx5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.775843 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-56zlb"] Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.775909 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-s58pr"] Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.776769 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mcpb\" (UniqueName: \"kubernetes.io/projected/f27f5c72-19c7-4d66-b927-0eae532ff4fe-kube-api-access-4mcpb\") pod \"marketplace-operator-79b997595-6wpdr\" (UID: \"f27f5c72-19c7-4d66-b927-0eae532ff4fe\") " pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.778422 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.784038 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" event={"ID":"72f5edb0-c000-4e80-b27d-d0d6023510f8","Type":"ContainerStarted","Data":"1620f78379a6434dd79ac5ac6c0de634083e8f3ccf4bf8e7ca5a12738d5e41b3"} Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.788936 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj"] Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.789565 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-sw5kt" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.790644 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-94dnk"] Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.791607 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh" event={"ID":"773bc628-94fe-43c5-8247-48c8d510df6a","Type":"ContainerStarted","Data":"eb4c6568e99afd7004760f6dadfc8ead4cf40684cd7e14593886cc49ca84c4ab"} Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.795280 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.796494 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5vd5\" (UniqueName: \"kubernetes.io/projected/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-kube-api-access-g5vd5\") pod \"collect-profiles-29565255-jmv4v\" (UID: \"b8a97d83-18b0-42eb-9ed9-f49ffff3d034\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.799980 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-h7csr" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.801334 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns" event={"ID":"a55008bb-1e97-4a50-9fa7-6a43c7edbc29","Type":"ContainerStarted","Data":"3ee69c43794b34dddef3ea423417ec1b132ac4ad8682e206e6f7c82cd9837c35"} Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.801380 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns" event={"ID":"a55008bb-1e97-4a50-9fa7-6a43c7edbc29","Type":"ContainerStarted","Data":"073133ef80d614e3dd5c8e28a5eddf19746256761267c32f4f96d400117ef926"} Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.802044 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:46 crc kubenswrapper[4765]: E0319 10:25:46.802215 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:47.302187648 +0000 UTC m=+245.651133180 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.802389 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: E0319 10:25:46.803046 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:47.303030771 +0000 UTC m=+245.651976313 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.811924 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xdm6r" event={"ID":"ea55ad1c-3f3c-418d-aed8-915b494eb6fa","Type":"ContainerStarted","Data":"f50b40c38eb8e5079b328e0c69f01be6196847560322c71b738e667e35b5aa74"} Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.820479 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.823828 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8vq9z\" (UniqueName: \"kubernetes.io/projected/bb55d3af-2526-425e-8a0a-25b779589866-kube-api-access-8vq9z\") pod \"catalog-operator-68c6474976-gvklb\" (UID: \"bb55d3af-2526-425e-8a0a-25b779589866\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.824165 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.844607 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rv285\" (UniqueName: \"kubernetes.io/projected/fba67808-dc6d-4f9e-bd53-9185baa79d78-kube-api-access-rv285\") pod \"router-default-5444994796-bj5n6\" (UID: \"fba67808-dc6d-4f9e-bd53-9185baa79d78\") " pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.847303 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.858058 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k979c\" (UniqueName: \"kubernetes.io/projected/761e2822-68b0-4ea8-ada6-80f8ce6dec21-kube-api-access-k979c\") pod \"machine-config-operator-74547568cd-j8smv\" (UID: \"761e2822-68b0-4ea8-ada6-80f8ce6dec21\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.861391 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-7hqvg"] Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.872824 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd"] Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.873390 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q" Mar 19 10:25:46 crc kubenswrapper[4765]: W0319 10:25:46.875550 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod39658af6_59cf_48c7_9015_2271021bd64e.slice/crio-889033b7b867a2e8b89847a5fa8c64dadb2f84f573cc02fffe753dae402fd2d8 WatchSource:0}: Error finding container 889033b7b867a2e8b89847a5fa8c64dadb2f84f573cc02fffe753dae402fd2d8: Status 404 returned error can't find the container with id 889033b7b867a2e8b89847a5fa8c64dadb2f84f573cc02fffe753dae402fd2d8 Mar 19 10:25:46 crc kubenswrapper[4765]: W0319 10:25:46.876622 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod926402ae_efb3_46fa_b415_8333a236c36a.slice/crio-eb0d07ba4b79b288f7d44486bb87878ccf547a1d26d195f3a53274bd73edfe75 WatchSource:0}: Error finding container eb0d07ba4b79b288f7d44486bb87878ccf547a1d26d195f3a53274bd73edfe75: Status 404 returned error can't find the container with id eb0d07ba4b79b288f7d44486bb87878ccf547a1d26d195f3a53274bd73edfe75 Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.882125 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q6hhf\" (UniqueName: \"kubernetes.io/projected/dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0-kube-api-access-q6hhf\") pod \"control-plane-machine-set-operator-78cbb6b69f-jdqns\" (UID: \"dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdqns" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.888129 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.894090 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vrsfr\" (UniqueName: \"kubernetes.io/projected/7bb184cb-8063-44da-9eb4-64cc23c9b1f4-kube-api-access-vrsfr\") pod \"csi-hostpathplugin-fl4n5\" (UID: \"7bb184cb-8063-44da-9eb4-64cc23c9b1f4\") " pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.903351 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:46 crc kubenswrapper[4765]: E0319 10:25:46.903588 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:47.403559971 +0000 UTC m=+245.752505523 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.903781 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:46 crc kubenswrapper[4765]: E0319 10:25:46.907110 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:47.407094527 +0000 UTC m=+245.756040119 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.916596 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v628j\" (UniqueName: \"kubernetes.io/projected/d566ef89-eb28-44ae-86ea-60a5a91803b0-kube-api-access-v628j\") pod \"machine-config-server-x9qcs\" (UID: \"d566ef89-eb28-44ae-86ea-60a5a91803b0\") " pod="openshift-machine-config-operator/machine-config-server-x9qcs" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.924332 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565264-nvg5v" Mar 19 10:25:46 crc kubenswrapper[4765]: W0319 10:25:46.929148 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf32ebd2_1bfc_4da0_959a_abc479034b0b.slice/crio-61970caf9a1a7c6a5d407062bc96ba4bf8a4ec93693c6fae3d0dd135746f4103 WatchSource:0}: Error finding container 61970caf9a1a7c6a5d407062bc96ba4bf8a4ec93693c6fae3d0dd135746f4103: Status 404 returned error can't find the container with id 61970caf9a1a7c6a5d407062bc96ba4bf8a4ec93693c6fae3d0dd135746f4103 Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.929600 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-524cv" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.935372 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/fc39b56a-2b78-4c24-9f99-e1357d76b391-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-phgbf\" (UID: \"fc39b56a-2b78-4c24-9f99-e1357d76b391\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.950164 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-x9qcs" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.955686 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 10:25:46.956886 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5r7z\" (UniqueName: \"kubernetes.io/projected/6f6b2566-55ba-4df7-be75-afdc03f5ea73-kube-api-access-q5r7z\") pod \"kube-storage-version-migrator-operator-b67b599dd-zs4v4\" (UID: \"6f6b2566-55ba-4df7-be75-afdc03f5ea73\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4" Mar 19 10:25:46 crc kubenswrapper[4765]: W0319 10:25:46.973136 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod993cdcd1_8323_49aa_b587_5a8c344a2077.slice/crio-f1ebc4c8be98a2a70d815a52c142aee13fae7675cadec5a1d9655eeaf5a96db1 WatchSource:0}: Error finding container f1ebc4c8be98a2a70d815a52c142aee13fae7675cadec5a1d9655eeaf5a96db1: Status 404 returned error can't find the container with id f1ebc4c8be98a2a70d815a52c142aee13fae7675cadec5a1d9655eeaf5a96db1 Mar 19 10:25:46 crc kubenswrapper[4765]: I0319 
10:25:46.984587 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-tk8m5"] Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.006503 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:47 crc kubenswrapper[4765]: E0319 10:25:47.007590 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:47.507554625 +0000 UTC m=+245.856500167 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.007695 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:47 crc kubenswrapper[4765]: E0319 10:25:47.008350 4765 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:47.508341406 +0000 UTC m=+245.857286948 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.035452 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.048057 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb" Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.066011 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv"] Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.113264 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdqns" Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.114486 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:47 crc kubenswrapper[4765]: E0319 10:25:47.115066 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:47.615040094 +0000 UTC m=+245.963985636 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.116185 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8"] Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.136531 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4" Mar 19 10:25:47 crc kubenswrapper[4765]: W0319 10:25:47.188267 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19a81c88_752a_4a04_a1f7_70cb357f7be1.slice/crio-d79ebb72005f7b9db2074a3c4d644c9e2e0565baba6b031bd7eafb2951f831c0 WatchSource:0}: Error finding container d79ebb72005f7b9db2074a3c4d644c9e2e0565baba6b031bd7eafb2951f831c0: Status 404 returned error can't find the container with id d79ebb72005f7b9db2074a3c4d644c9e2e0565baba6b031bd7eafb2951f831c0 Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.197083 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf" Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.217053 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:47 crc kubenswrapper[4765]: E0319 10:25:47.217420 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:47.717402984 +0000 UTC m=+246.066348526 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.227073 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-qfvdv"] Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.259406 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-h7csr"] Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.317589 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:47 crc kubenswrapper[4765]: E0319 10:25:47.317685 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:47.817662937 +0000 UTC m=+246.166608479 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.317717 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:47 crc kubenswrapper[4765]: E0319 10:25:47.318213 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:47.818203561 +0000 UTC m=+246.167149103 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:47 crc kubenswrapper[4765]: W0319 10:25:47.326755 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod900ba5b7_85f6_4924_af0b_61efc2c8598a.slice/crio-44e2a7502b22950b42d2d77df357b5d594aff056ec7f0db3eb674f8870144277 WatchSource:0}: Error finding container 44e2a7502b22950b42d2d77df357b5d594aff056ec7f0db3eb674f8870144277: Status 404 returned error can't find the container with id 44e2a7502b22950b42d2d77df357b5d594aff056ec7f0db3eb674f8870144277 Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.419337 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:47 crc kubenswrapper[4765]: E0319 10:25:47.419530 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:47.919483681 +0000 UTC m=+246.268429233 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.422383 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:47 crc kubenswrapper[4765]: E0319 10:25:47.422806 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:47.922790882 +0000 UTC m=+246.271736424 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.524139 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:47 crc kubenswrapper[4765]: E0319 10:25:47.524473 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:48.024452882 +0000 UTC m=+246.373398424 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.535664 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-z85bn" podStartSLOduration=175.535642767 podStartE2EDuration="2m55.535642767s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:47.532293326 +0000 UTC m=+245.881238868" watchObservedRunningTime="2026-03-19 10:25:47.535642767 +0000 UTC m=+245.884588309" Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.544231 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-jnmk9"] Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.625683 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:47 crc kubenswrapper[4765]: E0319 10:25:47.626087 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 10:25:48.126073972 +0000 UTC m=+246.475019514 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.726978 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:47 crc kubenswrapper[4765]: E0319 10:25:47.727222 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:48.227170857 +0000 UTC m=+246.576116399 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.727801 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:47 crc kubenswrapper[4765]: E0319 10:25:47.728236 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:48.228218016 +0000 UTC m=+246.577163558 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:47 crc kubenswrapper[4765]: W0319 10:25:47.802152 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd566ef89_eb28_44ae_86ea_60a5a91803b0.slice/crio-ecb067102cf2c981574e12e9972e4c68044fc86b2b6d90d2c3a90092abfa4d6c WatchSource:0}: Error finding container ecb067102cf2c981574e12e9972e4c68044fc86b2b6d90d2c3a90092abfa4d6c: Status 404 returned error can't find the container with id ecb067102cf2c981574e12e9972e4c68044fc86b2b6d90d2c3a90092abfa4d6c Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.833921 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:47 crc kubenswrapper[4765]: E0319 10:25:47.834213 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:48.334180194 +0000 UTC m=+246.683125746 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.853865 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:47 crc kubenswrapper[4765]: E0319 10:25:47.854240 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:48.35422453 +0000 UTC m=+246.703170072 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.877337 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6wpdr"] Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.879396 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-5bjx5"] Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.890018 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q"] Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.904641 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j"] Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.914112 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb"] Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.914192 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns" event={"ID":"a55008bb-1e97-4a50-9fa7-6a43c7edbc29","Type":"ContainerStarted","Data":"a89f5c8d4ec55713f7690ce139ec84d1e533224e01ff78875c9981f6c1711ef8"} Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.922942 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-94dnk" 
event={"ID":"39658af6-59cf-48c7-9015-2271021bd64e","Type":"ContainerStarted","Data":"889033b7b867a2e8b89847a5fa8c64dadb2f84f573cc02fffe753dae402fd2d8"} Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.952223 4765 generic.go:334] "Generic (PLEG): container finished" podID="773bc628-94fe-43c5-8247-48c8d510df6a" containerID="f26d5af981f7080c2e57791bfbf74feed66d7a60ad6574703ebdf0337d6c788a" exitCode=0 Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.954084 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh" event={"ID":"773bc628-94fe-43c5-8247-48c8d510df6a","Type":"ContainerDied","Data":"f26d5af981f7080c2e57791bfbf74feed66d7a60ad6574703ebdf0337d6c788a"} Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.954562 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:47 crc kubenswrapper[4765]: E0319 10:25:47.955017 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:48.454996496 +0000 UTC m=+246.803942038 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.994719 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-xdm6r" event={"ID":"ea55ad1c-3f3c-418d-aed8-915b494eb6fa","Type":"ContainerStarted","Data":"7f0100480f054c633cd9a2a890c2e269274642c7d1f22bd3380c5248cdcb5e02"} Mar 19 10:25:47 crc kubenswrapper[4765]: I0319 10:25:47.995408 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-xdm6r" Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.002073 4765 generic.go:334] "Generic (PLEG): container finished" podID="396daff4-daaf-43de-8794-5076381c0d47" containerID="9a2999d2775257324543a107e6bcdb0d1bb384358f921a5abd497cf3e2bfb628" exitCode=0 Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.002184 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" event={"ID":"396daff4-daaf-43de-8794-5076381c0d47","Type":"ContainerDied","Data":"9a2999d2775257324543a107e6bcdb0d1bb384358f921a5abd497cf3e2bfb628"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.010793 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" event={"ID":"72f5edb0-c000-4e80-b27d-d0d6023510f8","Type":"ContainerStarted","Data":"e4cbb1057b82fe99c1f796e131ad41f1a7123b069737d175ff6f2885b2d3c1c3"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.013303 4765 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.018181 4765 patch_prober.go:28] interesting pod/console-operator-58897d9998-xdm6r container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" start-of-body= Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.018247 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-xdm6r" podUID="ea55ad1c-3f3c-418d-aed8-915b494eb6fa" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.8:8443/readyz\": dial tcp 10.217.0.8:8443: connect: connection refused" Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.020990 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qfvdv" event={"ID":"900ba5b7-85f6-4924-af0b-61efc2c8598a","Type":"ContainerStarted","Data":"44e2a7502b22950b42d2d77df357b5d594aff056ec7f0db3eb674f8870144277"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.023947 4765 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-r9spg container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" start-of-body= Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.024044 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" podUID="72f5edb0-c000-4e80-b27d-d0d6023510f8" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.23:8443/healthz\": dial tcp 10.217.0.23:8443: connect: connection refused" Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 
10:25:48.025379 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd" event={"ID":"f418b4ca-da05-4139-b8d9-5614419e936b","Type":"ContainerStarted","Data":"4c150e018f6b434ea497b276ebe7b4e18a84d358c26993235147e14ad7e2541d"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.033177 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg" event={"ID":"993cdcd1-8323-49aa-b587-5a8c344a2077","Type":"ContainerStarted","Data":"f1ebc4c8be98a2a70d815a52c142aee13fae7675cadec5a1d9655eeaf5a96db1"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.046932 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-56zlb" event={"ID":"3bde734c-df56-471b-8a70-2f555a974e57","Type":"ContainerStarted","Data":"2a3c62442e8a0b0e919091e7b9edcf8661b612fbedf0a1dde133479485575776"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.063554 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:48 crc kubenswrapper[4765]: E0319 10:25:48.065429 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:48.565406755 +0000 UTC m=+246.914352297 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.074080 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs" event={"ID":"401e164a-fc29-412f-ab6e-1c911f6c2d0a","Type":"ContainerStarted","Data":"83c94011a88a322366466bfb60b01d84b0ac573f188cb18f9b21b7d73c95ab8f"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.078175 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" event={"ID":"19a81c88-752a-4a04-a1f7-70cb357f7be1","Type":"ContainerStarted","Data":"d79ebb72005f7b9db2074a3c4d644c9e2e0565baba6b031bd7eafb2951f831c0"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.081463 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj" event={"ID":"926402ae-efb3-46fa-b415-8333a236c36a","Type":"ContainerStarted","Data":"7200ed7ac03367e10925a7f3804479b60661e3f7d5f6de696b4fd46c280fdb33"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.081553 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj" event={"ID":"926402ae-efb3-46fa-b415-8333a236c36a","Type":"ContainerStarted","Data":"eb0d07ba4b79b288f7d44486bb87878ccf547a1d26d195f3a53274bd73edfe75"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.083213 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" event={"ID":"df32ebd2-1bfc-4da0-959a-abc479034b0b","Type":"ContainerStarted","Data":"61970caf9a1a7c6a5d407062bc96ba4bf8a4ec93693c6fae3d0dd135746f4103"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.088274 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tk8m5" event={"ID":"93df02a3-8614-47ff-a1ed-9592ef47d84e","Type":"ContainerStarted","Data":"928b6c0fcb100e43f6d118eedb816b67fa1fcf06088e18a0513e285f6f5e5da1"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.102146 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" podStartSLOduration=175.102121576 podStartE2EDuration="2m55.102121576s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:48.097207682 +0000 UTC m=+246.446153224" watchObservedRunningTime="2026-03-19 10:25:48.102121576 +0000 UTC m=+246.451067118" Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.107223 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd" event={"ID":"357560a9-5851-42f2-b627-a41d831d7f27","Type":"ContainerStarted","Data":"9c86dcb215cceeb62e566cba07d6891e88292e4991adcd06d4386f43931bfc7b"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.107276 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd" event={"ID":"357560a9-5851-42f2-b627-a41d831d7f27","Type":"ContainerStarted","Data":"49332acf1fc330eaddd343810285c485fe03e78fdf8cf37b2b0539e0f934ff56"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.113343 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/router-default-5444994796-bj5n6" event={"ID":"fba67808-dc6d-4f9e-bd53-9185baa79d78","Type":"ContainerStarted","Data":"63ce04881745e1b8551e0921dbe4df0a573b31a9a200574b3516ff21c898f027"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.117284 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x9qcs" event={"ID":"d566ef89-eb28-44ae-86ea-60a5a91803b0","Type":"ContainerStarted","Data":"ecb067102cf2c981574e12e9972e4c68044fc86b2b6d90d2c3a90092abfa4d6c"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.121747 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" event={"ID":"24b9c7be-22c4-4959-9332-d06229dd3371","Type":"ContainerStarted","Data":"89fdf77b09aefaf78f5688b3b981641cb96f0388d53dec52317c0f46955027c5"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.142386 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts" event={"ID":"93cd26ea-e56f-4bb3-9bae-1e9b552480d8","Type":"ContainerStarted","Data":"d00d12780361491bc5b5061c8590dcdd998b27d400bd1d81b7706fafa01b740a"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.142454 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts" event={"ID":"93cd26ea-e56f-4bb3-9bae-1e9b552480d8","Type":"ContainerStarted","Data":"467dc5886e9ebe9bfa98a9ae9b0e76fd835973d9b0ba75ca9423ccc10312b665"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.158329 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-h7csr" event={"ID":"c6536bc9-dc65-4acd-975f-87f5621fb0f3","Type":"ContainerStarted","Data":"4c0e67ab1b10c0ee73470818752420ba985d407d568476b7c30a59727df8d00f"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 
10:25:48.168064 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:48 crc kubenswrapper[4765]: E0319 10:25:48.169514 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:48.669483912 +0000 UTC m=+247.018429454 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.175500 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jnmk9" event={"ID":"4980eaf1-2428-41fe-8a4f-052aace46947","Type":"ContainerStarted","Data":"ba1f3af8cc7d2500320692fee4a2867f99bdb87f09fde0d3a67203219d4d47c0"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.236904 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" event={"ID":"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f","Type":"ContainerStarted","Data":"51decba02909f0333b6bfea08230359ef7374e3f74e4f96add5299dbef367c5a"} Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.238637 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:48 crc kubenswrapper[4765]: W0319 10:25:48.261256 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23d53283_d76c_40c3_804e_73fc3431ed98.slice/crio-75b46d1df4633eae71d703f4f6b39b35be6d954e826450bb2f5c4d2c6761b7ab WatchSource:0}: Error finding container 75b46d1df4633eae71d703f4f6b39b35be6d954e826450bb2f5c4d2c6761b7ab: Status 404 returned error can't find the container with id 75b46d1df4633eae71d703f4f6b39b35be6d954e826450bb2f5c4d2c6761b7ab Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.265548 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.273001 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:48 crc kubenswrapper[4765]: E0319 10:25:48.274348 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:48.77432999 +0000 UTC m=+247.123275532 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.377829 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:48 crc kubenswrapper[4765]: E0319 10:25:48.385042 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:48.885008356 +0000 UTC m=+247.233953898 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.385745 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:48 crc kubenswrapper[4765]: E0319 10:25:48.398304 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:48.898280008 +0000 UTC m=+247.247225550 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.439055 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x"] Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.466071 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd"] Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.473131 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v"] Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.481507 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-sw5kt"] Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.493602 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:48 crc kubenswrapper[4765]: E0319 10:25:48.495157 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 10:25:48.995131587 +0000 UTC m=+247.344077129 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.585745 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565264-nvg5v"] Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.597421 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdqns"] Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.600235 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:48 crc kubenswrapper[4765]: E0319 10:25:48.600763 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:49.100736875 +0000 UTC m=+247.449682417 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.635847 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4"] Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.637665 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-c96ts" podStartSLOduration=176.637638491 podStartE2EDuration="2m56.637638491s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:48.583492225 +0000 UTC m=+246.932437777" watchObservedRunningTime="2026-03-19 10:25:48.637638491 +0000 UTC m=+246.986584033" Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.649922 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv"] Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.656783 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" podStartSLOduration=175.656751642 podStartE2EDuration="2m55.656751642s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:48.630591449 
+0000 UTC m=+246.979537001" watchObservedRunningTime="2026-03-19 10:25:48.656751642 +0000 UTC m=+247.005697184" Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.682281 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-524cv"] Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.688540 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" podStartSLOduration=176.688508977 podStartE2EDuration="2m56.688508977s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:48.678545256 +0000 UTC m=+247.027490798" watchObservedRunningTime="2026-03-19 10:25:48.688508977 +0000 UTC m=+247.037454509" Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.702669 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:48 crc kubenswrapper[4765]: E0319 10:25:48.703114 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:49.203089385 +0000 UTC m=+247.552034927 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.708593 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-vc8ns" podStartSLOduration=176.708569144 podStartE2EDuration="2m56.708569144s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:48.706471117 +0000 UTC m=+247.055416659" watchObservedRunningTime="2026-03-19 10:25:48.708569144 +0000 UTC m=+247.057514706" Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.725668 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.741888 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-mt2gd" podStartSLOduration=176.741862711 podStartE2EDuration="2m56.741862711s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:48.736032813 +0000 UTC m=+247.084978365" watchObservedRunningTime="2026-03-19 10:25:48.741862711 +0000 UTC m=+247.090808253" Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.778563 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb"] Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.784614 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-xdm6r" podStartSLOduration=176.784591396 podStartE2EDuration="2m56.784591396s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:48.768676572 +0000 UTC m=+247.117622114" watchObservedRunningTime="2026-03-19 10:25:48.784591396 +0000 UTC m=+247.133536928" Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.784799 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-fl4n5"] Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.804191 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:48 crc kubenswrapper[4765]: E0319 10:25:48.804677 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:49.304661983 +0000 UTC m=+247.653607525 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.813968 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf"] Mar 19 10:25:48 crc kubenswrapper[4765]: W0319 10:25:48.829595 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb972d9ca_6117_4fd7_b488_8c4808f069d4.slice/crio-03b9e3773b00a3062cbe076b5397c9ab281ab7a4a32ac3ab47aa194e4c0cf7ea WatchSource:0}: Error finding container 03b9e3773b00a3062cbe076b5397c9ab281ab7a4a32ac3ab47aa194e4c0cf7ea: Status 404 returned error can't find the container with id 03b9e3773b00a3062cbe076b5397c9ab281ab7a4a32ac3ab47aa194e4c0cf7ea Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.871156 4765 ???:1] "http: TLS handshake error from 192.168.126.11:52108: no serving certificate available for the kubelet" Mar 19 10:25:48 crc kubenswrapper[4765]: W0319 10:25:48.882898 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb55d3af_2526_425e_8a0a_25b779589866.slice/crio-b93c0e06cd34e8d7908628f320a9eebdd39381ceb1f21265973990086b7fdebb WatchSource:0}: Error finding container b93c0e06cd34e8d7908628f320a9eebdd39381ceb1f21265973990086b7fdebb: Status 404 returned error can't find the container with id b93c0e06cd34e8d7908628f320a9eebdd39381ceb1f21265973990086b7fdebb Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.905569 4765 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:48 crc kubenswrapper[4765]: E0319 10:25:48.906020 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:49.406003605 +0000 UTC m=+247.754949147 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.953548 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:25:48 crc kubenswrapper[4765]: I0319 10:25:48.987477 4765 ???:1] "http: TLS handshake error from 192.168.126.11:52124: no serving certificate available for the kubelet" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.017264 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:49 crc kubenswrapper[4765]: E0319 10:25:49.017792 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:49.517773181 +0000 UTC m=+247.866718733 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.076606 4765 ???:1] "http: TLS handshake error from 192.168.126.11:52132: no serving certificate available for the kubelet" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.119484 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:49 crc kubenswrapper[4765]: E0319 10:25:49.119534 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:49.619495713 +0000 UTC m=+247.968441255 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.120270 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:49 crc kubenswrapper[4765]: E0319 10:25:49.120791 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:49.620758868 +0000 UTC m=+247.969704410 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.185614 4765 ???:1] "http: TLS handshake error from 192.168.126.11:52140: no serving certificate available for the kubelet" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.221613 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:49 crc kubenswrapper[4765]: E0319 10:25:49.222110 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:49.722086368 +0000 UTC m=+248.071031910 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.256460 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-sw5kt" event={"ID":"736dd712-8f96-4b9c-bf2b-2f3eb3d4a604","Type":"ContainerStarted","Data":"c65dabf090265d6254dd28e4b04d9253cb366dcaf8ce3113b686911a94e5ef42"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.261805 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-94dnk" event={"ID":"39658af6-59cf-48c7-9015-2271021bd64e","Type":"ContainerStarted","Data":"c472a4f28b6d608e9535fcdddaae326b95b00c26be8dc28ff396505d32df7875"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.268669 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-524cv" event={"ID":"b972d9ca-6117-4fd7-b488-8c4808f069d4","Type":"ContainerStarted","Data":"03b9e3773b00a3062cbe076b5397c9ab281ab7a4a32ac3ab47aa194e4c0cf7ea"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.274196 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j" event={"ID":"1c7ffece-fbab-4d6c-a327-f0402649a29e","Type":"ContainerStarted","Data":"06446d315db6266ad6bfc4a54e2449879c76f746f5f3288f8ebcb82b665168eb"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.292386 4765 ???:1] "http: TLS handshake error from 192.168.126.11:52152: no serving certificate available for the kubelet" Mar 19 10:25:49 crc 
kubenswrapper[4765]: I0319 10:25:49.297400 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-jnmk9" event={"ID":"4980eaf1-2428-41fe-8a4f-052aace46947","Type":"ContainerStarted","Data":"5216267aca0b8439f6c0334ebe37eccd43700d10a91eb395d531cdac1c824cee"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.298596 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-jnmk9" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.303397 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q" event={"ID":"3a1f010c-6208-44bd-b36d-95140eaa0cd7","Type":"ContainerStarted","Data":"01fe36d7f29c4434dbb78eea5ae368f2da7ac08cf83db033fdb356155bc23e69"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.303462 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q" event={"ID":"3a1f010c-6208-44bd-b36d-95140eaa0cd7","Type":"ContainerStarted","Data":"ed73b56eabf4286a86e639f869e6ef1a0c21068548d185e36e6409a7cb15776d"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.305629 4765 patch_prober.go:28] interesting pod/downloads-7954f5f757-jnmk9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.305686 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jnmk9" podUID="4980eaf1-2428-41fe-8a4f-052aace46947" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.326396 4765 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openshift-service-ca/service-ca-9c57cc56f-h7csr" event={"ID":"c6536bc9-dc65-4acd-975f-87f5621fb0f3","Type":"ContainerStarted","Data":"eff7bd4bca362fd1f912239e5dda7f97f3317facc5bd8b01e46c0204e447ac85"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.327880 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:49 crc kubenswrapper[4765]: E0319 10:25:49.329515 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:49.829253249 +0000 UTC m=+248.178198861 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.334423 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-94dnk" podStartSLOduration=177.334399459 podStartE2EDuration="2m57.334399459s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:49.300274089 +0000 UTC m=+247.649219631" watchObservedRunningTime="2026-03-19 10:25:49.334399459 +0000 UTC m=+247.683345001" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.335694 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-jnmk9" podStartSLOduration=177.335687384 podStartE2EDuration="2m57.335687384s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:49.335493409 +0000 UTC m=+247.684438971" watchObservedRunningTime="2026-03-19 10:25:49.335687384 +0000 UTC m=+247.684632946" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.345371 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" event={"ID":"1b4e46d6-a655-4649-9656-7c45ea94b38f","Type":"ContainerStarted","Data":"c3aa4e0aec6a6a7e2469dc8822b988f879ac19b813921fada88849c4b2409cc5"} Mar 19 10:25:49 crc kubenswrapper[4765]: 
I0319 10:25:49.365324 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-h7csr" podStartSLOduration=176.365297161 podStartE2EDuration="2m56.365297161s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:49.361562099 +0000 UTC m=+247.710507641" watchObservedRunningTime="2026-03-19 10:25:49.365297161 +0000 UTC m=+247.714242703" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.387933 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" event={"ID":"7bb184cb-8063-44da-9eb4-64cc23c9b1f4","Type":"ContainerStarted","Data":"9a2e8d1dde841875d73cb770a9a94b8d7e7de5b8ddbf78c4c0c8f40b7d4e745a"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.426598 4765 ???:1] "http: TLS handshake error from 192.168.126.11:52164: no serving certificate available for the kubelet" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.432546 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:49 crc kubenswrapper[4765]: E0319 10:25:49.434729 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:49.934694852 +0000 UTC m=+248.283640394 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.435873 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:49 crc kubenswrapper[4765]: E0319 10:25:49.436418 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:49.936409559 +0000 UTC m=+248.285355101 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.451191 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" event={"ID":"19a81c88-752a-4a04-a1f7-70cb357f7be1","Type":"ContainerStarted","Data":"17eee41ef07682b9747c12dc06a309083cadc724443901b572b7ca6577c44db1"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.474628 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdqns" event={"ID":"dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0","Type":"ContainerStarted","Data":"3deac53223e89166517b6d3a2f0aa547453068535b27bc729916c4bec6b06b2b"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.485935 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bjx5" event={"ID":"23d53283-d76c-40c3-804e-73fc3431ed98","Type":"ContainerStarted","Data":"75b46d1df4633eae71d703f4f6b39b35be6d954e826450bb2f5c4d2c6761b7ab"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.494215 4765 generic.go:334] "Generic (PLEG): container finished" podID="3bde734c-df56-471b-8a70-2f555a974e57" containerID="b29576c739d0c2c1dc5cc8a306f3b61a0af2f4b07ff49dea16dfb784f3c13cfe" exitCode=0 Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.494298 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-56zlb" 
event={"ID":"3bde734c-df56-471b-8a70-2f555a974e57","Type":"ContainerDied","Data":"b29576c739d0c2c1dc5cc8a306f3b61a0af2f4b07ff49dea16dfb784f3c13cfe"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.497728 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" event={"ID":"761e2822-68b0-4ea8-ada6-80f8ce6dec21","Type":"ContainerStarted","Data":"216e817be5ffcae56ad37786bed212602b1e9f719a53ab1e971de60c40269b29"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.499513 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-x9qcs" event={"ID":"d566ef89-eb28-44ae-86ea-60a5a91803b0","Type":"ContainerStarted","Data":"3ef553be394147b9ec295b7323e1e85961be3efa7d42d47fa0ff8c9d92afb042"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.507128 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-k7bsv" podStartSLOduration=176.507108296 podStartE2EDuration="2m56.507108296s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:49.505315277 +0000 UTC m=+247.854260839" watchObservedRunningTime="2026-03-19 10:25:49.507108296 +0000 UTC m=+247.856053838" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.538654 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:49 crc kubenswrapper[4765]: E0319 10:25:49.540299 4765 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:50.04028263 +0000 UTC m=+248.389228172 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.543823 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" event={"ID":"24b9c7be-22c4-4959-9332-d06229dd3371","Type":"ContainerStarted","Data":"a67466d0fb7e5808a4915d9dbe7238c1773e387920ca0a2495e1677d126b1a2d"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.567829 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4" event={"ID":"6f6b2566-55ba-4df7-be75-afdc03f5ea73","Type":"ContainerStarted","Data":"c98860ad29358b6cab7c50a9d8c86393a6d65be19c3b05d4d08e4d0de949dfc4"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.585131 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565264-nvg5v" event={"ID":"c5495eef-efca-4df2-81bb-bd93bb2f8a38","Type":"ContainerStarted","Data":"f201b92fb1a091f3f8e0794890369c4646767fdbe69f2df2861ab850ba26bd42"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.605419 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs" 
event={"ID":"401e164a-fc29-412f-ab6e-1c911f6c2d0a","Type":"ContainerStarted","Data":"cda267e2aa43cd450829a7f9602e56db8db101ae3799ced658b4328d77c66cc7"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.614832 4765 ???:1] "http: TLS handshake error from 192.168.126.11:52176: no serving certificate available for the kubelet" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.621516 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-x9qcs" podStartSLOduration=6.621478503 podStartE2EDuration="6.621478503s" podCreationTimestamp="2026-03-19 10:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:49.620779614 +0000 UTC m=+247.969725156" watchObservedRunningTime="2026-03-19 10:25:49.621478503 +0000 UTC m=+247.970424045" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.625473 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-bj5n6" event={"ID":"fba67808-dc6d-4f9e-bd53-9185baa79d78","Type":"ContainerStarted","Data":"889de415b34ce78bb238098849aecbec1db01384fe3740fed0107c59b843df91"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.632825 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf" event={"ID":"fc39b56a-2b78-4c24-9f99-e1357d76b391","Type":"ContainerStarted","Data":"f90b65b24d4aa8b53b46b84e6e2345e25b5abd98c1919da6f325f87862ce6b2e"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.633810 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" event={"ID":"4ea5c445-7213-449a-84b3-94ef0ddad18e","Type":"ContainerStarted","Data":"a5125a4ec01479d04de24212e91c71ed7da83703f0f7b3e153124a827099dac1"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 
10:25:49.635214 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg" event={"ID":"993cdcd1-8323-49aa-b587-5a8c344a2077","Type":"ContainerStarted","Data":"f64a134753ce436c18afc290dfdbd19b070b4651e5fabfb28d8330210ab1083c"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.636845 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb" event={"ID":"bb55d3af-2526-425e-8a0a-25b779589866","Type":"ContainerStarted","Data":"b93c0e06cd34e8d7908628f320a9eebdd39381ceb1f21265973990086b7fdebb"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.641271 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:49 crc kubenswrapper[4765]: E0319 10:25:49.644192 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:50.144153321 +0000 UTC m=+248.493098863 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.647836 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" event={"ID":"396daff4-daaf-43de-8794-5076381c0d47","Type":"ContainerStarted","Data":"aceee7d19cb86c57ca5f2716f954cca12661188a4747e28a729e75d8615d1311"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.669975 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" event={"ID":"b8a97d83-18b0-42eb-9ed9-f49ffff3d034","Type":"ContainerStarted","Data":"d4d1c451020ab37fdc094102f89fe0859a6463287c23990490b782119d3199a6"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.745491 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd" event={"ID":"63dd2ad3-2637-4d5d-99b8-255a37205e36","Type":"ContainerStarted","Data":"41401bd57de44ef171ecbd32ca3e23e7f6b95c1c1a4699fabe3aa325de23d254"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.745984 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:49 crc kubenswrapper[4765]: E0319 10:25:49.749507 4765 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:50.249471831 +0000 UTC m=+248.598417373 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.762498 4765 ???:1] "http: TLS handshake error from 192.168.126.11:52180: no serving certificate available for the kubelet" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.780799 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd" event={"ID":"f418b4ca-da05-4139-b8d9-5614419e936b","Type":"ContainerStarted","Data":"48c7fcd0dc52fb6d5b1a8710712a5efaf646dc462bfb2b68ea488a61ed4b2e31"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.800351 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" event={"ID":"f27f5c72-19c7-4d66-b927-0eae532ff4fe","Type":"ContainerStarted","Data":"29c3ded5eb301c37b9a131acbf93ff931ae30a728a251a8c9e530426184fc0f7"} Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.801513 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.812681 4765 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6wpdr container/marketplace-operator namespace/openshift-marketplace: 
Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.812746 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" podUID="f27f5c72-19c7-4d66-b927-0eae532ff4fe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.817391 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-xdm6r" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.817581 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.819057 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" podStartSLOduration=176.819043217 podStartE2EDuration="2m56.819043217s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:49.817482255 +0000 UTC m=+248.166427807" watchObservedRunningTime="2026-03-19 10:25:49.819043217 +0000 UTC m=+248.167988749" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.819928 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-bj5n6" podStartSLOduration=176.819923391 podStartE2EDuration="2m56.819923391s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
10:25:49.745155614 +0000 UTC m=+248.094101156" watchObservedRunningTime="2026-03-19 10:25:49.819923391 +0000 UTC m=+248.168868933" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.853865 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.854448 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" start-of-body= Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.854514 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="Get \"http://localhost:1936/healthz/ready\": dial tcp [::1]:1936: connect: connection refused" Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.855053 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:49 crc kubenswrapper[4765]: E0319 10:25:49.855489 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:50.35546915 +0000 UTC m=+248.704414692 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.959770 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:49 crc kubenswrapper[4765]: E0319 10:25:49.962303 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:50.462272361 +0000 UTC m=+248.811217903 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:49 crc kubenswrapper[4765]: I0319 10:25:49.990671 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-xhwkd" podStartSLOduration=176.990649634 podStartE2EDuration="2m56.990649634s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:49.987426156 +0000 UTC m=+248.336371698" watchObservedRunningTime="2026-03-19 10:25:49.990649634 +0000 UTC m=+248.339595176" Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.027133 4765 ???:1] "http: TLS handshake error from 192.168.126.11:52196: no serving certificate available for the kubelet" Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.067447 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:50 crc kubenswrapper[4765]: E0319 10:25:50.067825 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 10:25:50.567809147 +0000 UTC m=+248.916754689 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.169826 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:50 crc kubenswrapper[4765]: E0319 10:25:50.170275 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:50.670251929 +0000 UTC m=+249.019197471 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.273427 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:50 crc kubenswrapper[4765]: E0319 10:25:50.274095 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:50.774068548 +0000 UTC m=+249.123014090 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.391941 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:50 crc kubenswrapper[4765]: E0319 10:25:50.392525 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:50.892462235 +0000 UTC m=+249.241407777 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.473182 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" podStartSLOduration=177.473129394 podStartE2EDuration="2m57.473129394s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:50.17544027 +0000 UTC m=+248.524385832" watchObservedRunningTime="2026-03-19 10:25:50.473129394 +0000 UTC m=+248.822074936" Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.475894 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r9spg"] Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.493816 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:50 crc kubenswrapper[4765]: E0319 10:25:50.494326 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 10:25:50.994310071 +0000 UTC m=+249.343255613 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.496610 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w"] Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.595088 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:50 crc kubenswrapper[4765]: E0319 10:25:50.595581 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:51.09553209 +0000 UTC m=+249.444477632 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.595774 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:50 crc kubenswrapper[4765]: E0319 10:25:50.596301 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:51.09629033 +0000 UTC m=+249.445235872 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.697903 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:50 crc kubenswrapper[4765]: E0319 10:25:50.698190 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:51.198167617 +0000 UTC m=+249.547113159 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.700362 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:50 crc kubenswrapper[4765]: E0319 10:25:50.700760 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:51.200745477 +0000 UTC m=+249.549691029 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.801663 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:50 crc kubenswrapper[4765]: E0319 10:25:50.801993 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:51.301944805 +0000 UTC m=+249.650890347 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.802200 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:50 crc kubenswrapper[4765]: E0319 10:25:50.802585 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:51.302574952 +0000 UTC m=+249.651520494 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.825621 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.825673 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.847944 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.856251 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:25:50 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:25:50 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:25:50 crc kubenswrapper[4765]: healthz check failed Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.856337 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.856743 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bjx5" event={"ID":"23d53283-d76c-40c3-804e-73fc3431ed98","Type":"ContainerStarted","Data":"f2c5fa7ae01fcb198525de44d1ddaf780c67a3e26b3ad348a3e4fbb08c76a571"} Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.871694 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tk8m5" event={"ID":"93df02a3-8614-47ff-a1ed-9592ef47d84e","Type":"ContainerStarted","Data":"9f9fa7570cccc496f474adda8f42538529bff8aadf15d85b19b73f96412cf095"} Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.904689 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" event={"ID":"4ea5c445-7213-449a-84b3-94ef0ddad18e","Type":"ContainerStarted","Data":"90e2abcbbdf721acd7af5862fc6ffc990d3049583483f9dd08b9727e97b71880"} Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.906532 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.906947 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:50 crc kubenswrapper[4765]: E0319 10:25:50.907870 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:51.407846241 +0000 UTC m=+249.756791793 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.914365 4765 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-7jw7x container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" start-of-body= Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.914464 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" podUID="4ea5c445-7213-449a-84b3-94ef0ddad18e" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.34:5443/healthz\": dial tcp 10.217.0.34:5443: connect: connection refused" Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.934295 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" event={"ID":"f27f5c72-19c7-4d66-b927-0eae532ff4fe","Type":"ContainerStarted","Data":"d5157a82387bb5080cd401571c48cc1a277e4ccbfe656a5a755dd8c507080b2a"} Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.935195 4765 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6wpdr container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.935234 4765 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" podUID="f27f5c72-19c7-4d66-b927-0eae532ff4fe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.975555 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" podStartSLOduration=177.975523606 podStartE2EDuration="2m57.975523606s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:50.972831602 +0000 UTC m=+249.321777144" watchObservedRunningTime="2026-03-19 10:25:50.975523606 +0000 UTC m=+249.324469148" Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.978453 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qfvdv" event={"ID":"900ba5b7-85f6-4924-af0b-61efc2c8598a","Type":"ContainerStarted","Data":"f24ce6cff144cfb953ea7cc347e50c10f10a3fd56cf75856d0044b5b28aa6e39"} Mar 19 10:25:50 crc kubenswrapper[4765]: I0319 10:25:50.992284 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-524cv" event={"ID":"b972d9ca-6117-4fd7-b488-8c4808f069d4","Type":"ContainerStarted","Data":"7af715497371008a9e43daf3b39258266e700f8a2b4cfd81eab76d45ab242180"} Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.009791 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:51 crc kubenswrapper[4765]: E0319 10:25:51.011222 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:51.511207558 +0000 UTC m=+249.860153090 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.026596 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-524cv" podStartSLOduration=8.026576037 podStartE2EDuration="8.026576037s" podCreationTimestamp="2026-03-19 10:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:51.024760678 +0000 UTC m=+249.373706230" watchObservedRunningTime="2026-03-19 10:25:51.026576037 +0000 UTC m=+249.375521579" Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.080336 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" event={"ID":"b8a97d83-18b0-42eb-9ed9-f49ffff3d034","Type":"ContainerStarted","Data":"c2a3fc499763766084e1f741340244299ced9cfe6bea53084e18f4fc9d4e9a8a"} Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.114840 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh" 
event={"ID":"773bc628-94fe-43c5-8247-48c8d510df6a","Type":"ContainerStarted","Data":"ca64ef21c8ccfaef29e78f295884a062507b8ebf5fe0d6742fc430ed418253d9"} Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.114887 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh" Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.120075 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:51 crc kubenswrapper[4765]: E0319 10:25:51.120727 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:51.620703482 +0000 UTC m=+249.969649024 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.121220 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:51 crc kubenswrapper[4765]: E0319 10:25:51.122581 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:51.622568403 +0000 UTC m=+249.971513945 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.148031 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" event={"ID":"1b4e46d6-a655-4649-9656-7c45ea94b38f","Type":"ContainerStarted","Data":"5bccddd5e49b32d0652fccd970a5c459eeadc42faaac075c170199b3e4ee04cf"} Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.148655 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.172137 4765 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-2lqmb container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" start-of-body= Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.172204 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" podUID="1b4e46d6-a655-4649-9656-7c45ea94b38f" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.41:8443/healthz\": dial tcp 10.217.0.41:8443: connect: connection refused" Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.200532 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj" 
event={"ID":"926402ae-efb3-46fa-b415-8333a236c36a","Type":"ContainerStarted","Data":"9e625690b78c38a6b1556f2498b01dd6db3980ca6eeaa140217177a9c9e1b3f4"} Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.219383 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j" event={"ID":"1c7ffece-fbab-4d6c-a327-f0402649a29e","Type":"ContainerStarted","Data":"94c212d108757407b9ce0d6e4ea08616ab4bc701a22ad8e39d3b33b9191f80eb"} Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.225178 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.226701 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" podStartSLOduration=179.2266665 podStartE2EDuration="2m59.2266665s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:51.164362942 +0000 UTC m=+249.513308494" watchObservedRunningTime="2026-03-19 10:25:51.2266665 +0000 UTC m=+249.575612042" Mar 19 10:25:51 crc kubenswrapper[4765]: E0319 10:25:51.227077 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:51.727058711 +0000 UTC m=+250.076004253 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.229074 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd" event={"ID":"63dd2ad3-2637-4d5d-99b8-255a37205e36","Type":"ContainerStarted","Data":"9ddb841271e107ff01e794b9c5aba6a0c3526e41528c78a7782e4243766fc558"} Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.245637 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg" event={"ID":"993cdcd1-8323-49aa-b587-5a8c344a2077","Type":"ContainerStarted","Data":"4625bc274942134416f9a497d866bfbcfec8abf49b8f29656b0663ca351e76a3"} Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.258817 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdqns" event={"ID":"dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0","Type":"ContainerStarted","Data":"5a3f39838aed2a4e958bf2413a0142b83b476fd6e29c1641117789ad81e2b6a1"} Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.274216 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh" podStartSLOduration=179.274191996 podStartE2EDuration="2m59.274191996s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:51.227707129 +0000 UTC m=+249.576652691" 
watchObservedRunningTime="2026-03-19 10:25:51.274191996 +0000 UTC m=+249.623137528" Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.284492 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" event={"ID":"df32ebd2-1bfc-4da0-959a-abc479034b0b","Type":"ContainerStarted","Data":"5c13b00e29efbae0e5b19ead7e0b7ac3424aaa5d4d9db8a7846b8f510ff1b435"} Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.289363 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" podUID="7b848589-8f09-4ff0-b6cc-1e92be8c5c80" containerName="route-controller-manager" containerID="cri-o://f38159f53a328f647161dc86fe221ff09afd0d13de8cd1c60cb92b1a6d712eb0" gracePeriod=30 Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.294197 4765 patch_prober.go:28] interesting pod/downloads-7954f5f757-jnmk9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.294283 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jnmk9" podUID="4980eaf1-2428-41fe-8a4f-052aace46947" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.310444 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" podStartSLOduration=178.310416043 podStartE2EDuration="2m58.310416043s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
10:25:51.277273529 +0000 UTC m=+249.626219071" watchObservedRunningTime="2026-03-19 10:25:51.310416043 +0000 UTC m=+249.659361575" Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.311518 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-jdqns" podStartSLOduration=178.311512663 podStartE2EDuration="2m58.311512663s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:51.306195088 +0000 UTC m=+249.655140630" watchObservedRunningTime="2026-03-19 10:25:51.311512663 +0000 UTC m=+249.660458205" Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.324438 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-4tmv9" Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.327169 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:51 crc kubenswrapper[4765]: E0319 10:25:51.331989 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:51.83197089 +0000 UTC m=+250.180916432 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.435327 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:51 crc kubenswrapper[4765]: E0319 10:25:51.436202 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:51.93618091 +0000 UTC m=+250.285126452 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.545269 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:51 crc kubenswrapper[4765]: E0319 10:25:51.545746 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:52.045728086 +0000 UTC m=+250.394673628 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.646838 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:51 crc kubenswrapper[4765]: E0319 10:25:51.647920 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:52.14789819 +0000 UTC m=+250.496843732 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.751040 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:51 crc kubenswrapper[4765]: E0319 10:25:51.751595 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:52.251564486 +0000 UTC m=+250.600510028 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.852693 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:51 crc kubenswrapper[4765]: E0319 10:25:51.852912 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:52.352870037 +0000 UTC m=+250.701815589 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.853125 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:51 crc kubenswrapper[4765]: E0319 10:25:51.853662 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:52.353640137 +0000 UTC m=+250.702585679 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.862702 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:25:51 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:25:51 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:25:51 crc kubenswrapper[4765]: healthz check failed Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.862777 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.954355 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:51 crc kubenswrapper[4765]: E0319 10:25:51.955535 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-19 10:25:52.455512924 +0000 UTC m=+250.804458466 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:51 crc kubenswrapper[4765]: I0319 10:25:51.959679 4765 ???:1] "http: TLS handshake error from 192.168.126.11:52198: no serving certificate available for the kubelet" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.028712 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-7hqvg" podStartSLOduration=179.028689148 podStartE2EDuration="2m59.028689148s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:51.80751232 +0000 UTC m=+250.156457862" watchObservedRunningTime="2026-03-19 10:25:52.028689148 +0000 UTC m=+250.377634690" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.064870 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:52 crc kubenswrapper[4765]: E0319 10:25:52.065311 4765 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:52.565295986 +0000 UTC m=+250.914241528 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.090383 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-gg89j" podStartSLOduration=179.090362309 podStartE2EDuration="2m59.090362309s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:52.031488845 +0000 UTC m=+250.380434397" watchObservedRunningTime="2026-03-19 10:25:52.090362309 +0000 UTC m=+250.439307851" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.136445 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-krtbj" podStartSLOduration=179.136418014 podStartE2EDuration="2m59.136418014s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:52.091090029 +0000 UTC m=+250.440035581" watchObservedRunningTime="2026-03-19 10:25:52.136418014 +0000 UTC m=+250.485363556" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 
10:25:52.167673 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:52 crc kubenswrapper[4765]: E0319 10:25:52.167867 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:52.667836681 +0000 UTC m=+251.016782223 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.168006 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs\") pod \"network-metrics-daemon-t8k4k\" (UID: \"ab39cf0a-a301-484b-9328-19acff8edae9\") " pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.168110 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:52 crc kubenswrapper[4765]: E0319 10:25:52.168544 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:52.668528339 +0000 UTC m=+251.017473881 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.195423 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.229900 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/ab39cf0a-a301-484b-9328-19acff8edae9-metrics-certs\") pod \"network-metrics-daemon-t8k4k\" (UID: \"ab39cf0a-a301-484b-9328-19acff8edae9\") " pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.240532 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-xz4gd" podStartSLOduration=179.240491991 podStartE2EDuration="2m59.240491991s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:52.167381618 +0000 UTC m=+250.516327160" watchObservedRunningTime="2026-03-19 
10:25:52.240491991 +0000 UTC m=+250.589437523" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.270551 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:52 crc kubenswrapper[4765]: E0319 10:25:52.271101 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:52.771080954 +0000 UTC m=+251.120026496 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.300383 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.307517 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-t8k4k" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.353198 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" event={"ID":"7bb184cb-8063-44da-9eb4-64cc23c9b1f4","Type":"ContainerStarted","Data":"5f22fd21fef6e2c17153247a2c9d7c19b03b520deb357b1c5c56fe4908d377fc"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.356609 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-s58pr" podStartSLOduration=179.356581085 podStartE2EDuration="2m59.356581085s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:52.243359209 +0000 UTC m=+250.592304741" watchObservedRunningTime="2026-03-19 10:25:52.356581085 +0000 UTC m=+250.705526627" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.372676 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:52 crc kubenswrapper[4765]: E0319 10:25:52.373194 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:52.873174847 +0000 UTC m=+251.222120389 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.388542 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q" event={"ID":"3a1f010c-6208-44bd-b36d-95140eaa0cd7","Type":"ContainerStarted","Data":"139db1c11ea5369208f6f2a3cdbb81e00500650055aa09a58baf0f6d64ad4a2a"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.388931 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.427404 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-qfvdv" event={"ID":"900ba5b7-85f6-4924-af0b-61efc2c8598a","Type":"ContainerStarted","Data":"64aeeba98e3b0121ca0ace1f348b284e98e95de94409be122da75a53df1d3f7a"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.474089 4765 generic.go:334] "Generic (PLEG): container finished" podID="7b848589-8f09-4ff0-b6cc-1e92be8c5c80" containerID="f38159f53a328f647161dc86fe221ff09afd0d13de8cd1c60cb92b1a6d712eb0" exitCode=0 Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.474566 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" event={"ID":"7b848589-8f09-4ff0-b6cc-1e92be8c5c80","Type":"ContainerDied","Data":"f38159f53a328f647161dc86fe221ff09afd0d13de8cd1c60cb92b1a6d712eb0"} Mar 19 10:25:52 crc kubenswrapper[4765]: 
I0319 10:25:52.474837 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.475826 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:52 crc kubenswrapper[4765]: E0319 10:25:52.476517 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:52.976483522 +0000 UTC m=+251.325429064 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.552016 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs" event={"ID":"401e164a-fc29-412f-ab6e-1c911f6c2d0a","Type":"ContainerStarted","Data":"49370671216f6890dc3fd1909c04b4aef18426322b984b9300e8dafc51a40f72"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.577240 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-serving-cert\") pod \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\" (UID: \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.577293 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-config\") pod \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\" (UID: \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.577508 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-client-ca\") pod \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\" (UID: \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.577562 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nkbcm\" (UniqueName: \"kubernetes.io/projected/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-kube-api-access-nkbcm\") pod \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\" (UID: \"7b848589-8f09-4ff0-b6cc-1e92be8c5c80\") " Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.577942 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:52 crc kubenswrapper[4765]: E0319 10:25:52.578392 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-19 10:25:53.078373549 +0000 UTC m=+251.427319101 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.579632 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-client-ca" (OuterVolumeSpecName: "client-ca") pod "7b848589-8f09-4ff0-b6cc-1e92be8c5c80" (UID: "7b848589-8f09-4ff0-b6cc-1e92be8c5c80"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.581045 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-config" (OuterVolumeSpecName: "config") pod "7b848589-8f09-4ff0-b6cc-1e92be8c5c80" (UID: "7b848589-8f09-4ff0-b6cc-1e92be8c5c80"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.594343 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-kube-api-access-nkbcm" (OuterVolumeSpecName: "kube-api-access-nkbcm") pod "7b848589-8f09-4ff0-b6cc-1e92be8c5c80" (UID: "7b848589-8f09-4ff0-b6cc-1e92be8c5c80"). InnerVolumeSpecName "kube-api-access-nkbcm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.599495 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7b848589-8f09-4ff0-b6cc-1e92be8c5c80" (UID: "7b848589-8f09-4ff0-b6cc-1e92be8c5c80"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.599753 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" event={"ID":"24b9c7be-22c4-4959-9332-d06229dd3371","Type":"ContainerStarted","Data":"ec820000de5663e61edac6787a115bf0cfd0d28fc883d1f7321902543f9a5ca0"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.636374 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bjx5" event={"ID":"23d53283-d76c-40c3-804e-73fc3431ed98","Type":"ContainerStarted","Data":"f34fe6332551a588916fab10c764c83db1f12478fb700ef6ec9e138cd813b4d1"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.674182 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-sw5kt" event={"ID":"736dd712-8f96-4b9c-bf2b-2f3eb3d4a604","Type":"ContainerStarted","Data":"b426702385848586394eae441a2dadfc43ae0951398d25f0ae8b3e67280d0e46"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.674495 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-sw5kt" event={"ID":"736dd712-8f96-4b9c-bf2b-2f3eb3d4a604","Type":"ContainerStarted","Data":"a0b2ad83f5b25eaccc9e24129bf0b0eb41619ecaf71bacf7cde0885bbc765e50"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.678652 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.679038 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.679068 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nkbcm\" (UniqueName: \"kubernetes.io/projected/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-kube-api-access-nkbcm\") on node \"crc\" DevicePath \"\"" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.679085 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.679097 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7b848589-8f09-4ff0-b6cc-1e92be8c5c80-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:25:52 crc kubenswrapper[4765]: E0319 10:25:52.679658 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:53.179627669 +0000 UTC m=+251.528573211 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.686747 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-tk8m5" event={"ID":"93df02a3-8614-47ff-a1ed-9592ef47d84e","Type":"ContainerStarted","Data":"19fe23c401fb553e3e820e16d9d5797fe9b217e0c9034520b731a63d57d72ceb"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.687411 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-tk8m5" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.703341 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4" event={"ID":"6f6b2566-55ba-4df7-be75-afdc03f5ea73","Type":"ContainerStarted","Data":"e1bdb259eb5c8b20f8bbeca1409261310c012564faaf5db05454c9d56c54d3e1"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.746103 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-56zlb" event={"ID":"3bde734c-df56-471b-8a70-2f555a974e57","Type":"ContainerStarted","Data":"4e6a20ea107d0d508ffbcfc4dd6fec69ac6f64ab1143060f43c060e48f12c90d"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.746182 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-56zlb" event={"ID":"3bde734c-df56-471b-8a70-2f555a974e57","Type":"ContainerStarted","Data":"af8ef70ccb616e6e3ea1715d865e8d0b5e9e5b4e428355390785611b1e205bf2"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 
10:25:52.771031 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf" event={"ID":"fc39b56a-2b78-4c24-9f99-e1357d76b391","Type":"ContainerStarted","Data":"1cb5d7c0141eb838129f994ed6d90b77d2eb7e457fcf6c2de2627bf219e92a85"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.788052 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:52 crc kubenswrapper[4765]: E0319 10:25:52.789726 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:53.289703748 +0000 UTC m=+251.638649280 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.796677 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" event={"ID":"761e2822-68b0-4ea8-ada6-80f8ce6dec21","Type":"ContainerStarted","Data":"c457a51e5f41342690f33d3d1540594a4a07e78e27cec04dd903c6deeb772d7b"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.796741 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" event={"ID":"761e2822-68b0-4ea8-ada6-80f8ce6dec21","Type":"ContainerStarted","Data":"d4fb862f85b1f992c84d4e840c2988a51ede1278920a5230a90ebcd8d1e2043d"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.860867 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-phgbf" podStartSLOduration=179.860844167 podStartE2EDuration="2m59.860844167s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:52.856232461 +0000 UTC m=+251.205178003" watchObservedRunningTime="2026-03-19 10:25:52.860844167 +0000 UTC m=+251.209789709" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.876319 4765 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6wpdr container/marketplace-operator namespace/openshift-marketplace: Readiness probe 
status=failure output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" start-of-body= Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.876419 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" podUID="f27f5c72-19c7-4d66-b927-0eae532ff4fe" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.35:8080/healthz\": dial tcp 10.217.0.35:8080: connect: connection refused" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.877061 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb" event={"ID":"bb55d3af-2526-425e-8a0a-25b779589866","Type":"ContainerStarted","Data":"a3e175904da7eb5583a4d6e49867a141e3413f058651e895a1806d9d6ebb3860"} Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.879207 4765 patch_prober.go:28] interesting pod/downloads-7954f5f757-jnmk9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.882180 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jnmk9" podUID="4980eaf1-2428-41fe-8a4f-052aace46947" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.882703 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.882083 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" 
podUID="72f5edb0-c000-4e80-b27d-d0d6023510f8" containerName="controller-manager" containerID="cri-o://e4cbb1057b82fe99c1f796e131ad41f1a7123b069737d175ff6f2885b2d3c1c3" gracePeriod=30 Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.891491 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.939998 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:25:52 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:25:52 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:25:52 crc kubenswrapper[4765]: healthz check failed Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.940419 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-2lqmb" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.940719 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.946234 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb" Mar 19 10:25:52 crc kubenswrapper[4765]: E0319 10:25:52.970010 4765 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:53.469975851 +0000 UTC m=+251.818921393 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.977132 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh"] Mar 19 10:25:52 crc kubenswrapper[4765]: E0319 10:25:52.977381 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b848589-8f09-4ff0-b6cc-1e92be8c5c80" containerName="route-controller-manager" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.977394 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b848589-8f09-4ff0-b6cc-1e92be8c5c80" containerName="route-controller-manager" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.977511 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b848589-8f09-4ff0-b6cc-1e92be8c5c80" containerName="route-controller-manager" Mar 19 10:25:52 crc kubenswrapper[4765]: I0319 10:25:52.977920 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.009628 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh"] Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.013779 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-5bjx5" podStartSLOduration=180.013744514 podStartE2EDuration="3m0.013744514s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:53.012310445 +0000 UTC m=+251.361255987" watchObservedRunningTime="2026-03-19 10:25:53.013744514 +0000 UTC m=+251.362690056" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.014437 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:53 crc kubenswrapper[4765]: E0319 10:25:53.014791 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:53.514774972 +0000 UTC m=+251.863720514 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.074238 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-tk8m5" podStartSLOduration=10.074202541 podStartE2EDuration="10.074202541s" podCreationTimestamp="2026-03-19 10:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:53.071409405 +0000 UTC m=+251.420354947" watchObservedRunningTime="2026-03-19 10:25:53.074202541 +0000 UTC m=+251.423148083" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.101971 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs" podStartSLOduration=181.101933977 podStartE2EDuration="3m1.101933977s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:53.1016553 +0000 UTC m=+251.450600842" watchObservedRunningTime="2026-03-19 10:25:53.101933977 +0000 UTC m=+251.450879519" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.137764 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.138075 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c56525a8-f5a8-4c82-9c27-27da84cc5b63-config\") pod \"route-controller-manager-5d9546575c-z4vjh\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.138119 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc4tw\" (UniqueName: \"kubernetes.io/projected/c56525a8-f5a8-4c82-9c27-27da84cc5b63-kube-api-access-pc4tw\") pod \"route-controller-manager-5d9546575c-z4vjh\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.138165 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c56525a8-f5a8-4c82-9c27-27da84cc5b63-serving-cert\") pod \"route-controller-manager-5d9546575c-z4vjh\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.138190 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c56525a8-f5a8-4c82-9c27-27da84cc5b63-client-ca\") pod \"route-controller-manager-5d9546575c-z4vjh\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:53 crc kubenswrapper[4765]: E0319 10:25:53.138358 4765 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:53.638338229 +0000 UTC m=+251.987283771 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.221188 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-28fs8" podStartSLOduration=180.221161067 podStartE2EDuration="3m0.221161067s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:53.205158791 +0000 UTC m=+251.554104333" watchObservedRunningTime="2026-03-19 10:25:53.221161067 +0000 UTC m=+251.570106609" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.240932 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c56525a8-f5a8-4c82-9c27-27da84cc5b63-config\") pod \"route-controller-manager-5d9546575c-z4vjh\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.241037 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pc4tw\" (UniqueName: 
\"kubernetes.io/projected/c56525a8-f5a8-4c82-9c27-27da84cc5b63-kube-api-access-pc4tw\") pod \"route-controller-manager-5d9546575c-z4vjh\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.241100 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c56525a8-f5a8-4c82-9c27-27da84cc5b63-serving-cert\") pod \"route-controller-manager-5d9546575c-z4vjh\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.241131 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c56525a8-f5a8-4c82-9c27-27da84cc5b63-client-ca\") pod \"route-controller-manager-5d9546575c-z4vjh\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.241198 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:53 crc kubenswrapper[4765]: E0319 10:25:53.241611 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:53.741593193 +0000 UTC m=+252.090538735 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.243073 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c56525a8-f5a8-4c82-9c27-27da84cc5b63-config\") pod \"route-controller-manager-5d9546575c-z4vjh\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.243711 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c56525a8-f5a8-4c82-9c27-27da84cc5b63-client-ca\") pod \"route-controller-manager-5d9546575c-z4vjh\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.264348 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c56525a8-f5a8-4c82-9c27-27da84cc5b63-serving-cert\") pod \"route-controller-manager-5d9546575c-z4vjh\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.314763 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc4tw\" (UniqueName: \"kubernetes.io/projected/c56525a8-f5a8-4c82-9c27-27da84cc5b63-kube-api-access-pc4tw\") pod 
\"route-controller-manager-5d9546575c-z4vjh\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.329467 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-sw5kt" podStartSLOduration=180.329444768 podStartE2EDuration="3m0.329444768s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:53.287128044 +0000 UTC m=+251.636073596" watchObservedRunningTime="2026-03-19 10:25:53.329444768 +0000 UTC m=+251.678390310" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.333023 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-t8k4k"] Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.342134 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:53 crc kubenswrapper[4765]: E0319 10:25:53.342715 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:53.842698549 +0000 UTC m=+252.191644091 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.385398 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.403260 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-7jw7x" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.416045 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q" podStartSLOduration=180.416023137 podStartE2EDuration="3m0.416023137s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:53.367260718 +0000 UTC m=+251.716206280" watchObservedRunningTime="2026-03-19 10:25:53.416023137 +0000 UTC m=+251.764968679" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.445028 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:53 crc kubenswrapper[4765]: E0319 
10:25:53.445499 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:53.94548417 +0000 UTC m=+252.294429712 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.479803 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-j8smv" podStartSLOduration=180.479776215 podStartE2EDuration="3m0.479776215s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:53.470453961 +0000 UTC m=+251.819399503" watchObservedRunningTime="2026-03-19 10:25:53.479776215 +0000 UTC m=+251.828721757" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.480269 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-qfvdv" podStartSLOduration=180.480250768 podStartE2EDuration="3m0.480250768s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:53.418562327 +0000 UTC m=+251.767507869" watchObservedRunningTime="2026-03-19 10:25:53.480250768 +0000 UTC m=+251.829196310" Mar 19 10:25:53 crc 
kubenswrapper[4765]: I0319 10:25:53.536522 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-2gjzh" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.548214 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:53 crc kubenswrapper[4765]: E0319 10:25:53.548873 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:54.048851157 +0000 UTC m=+252.397796699 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.569310 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-56zlb" podStartSLOduration=181.569292835 podStartE2EDuration="3m1.569292835s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:53.554157192 +0000 UTC m=+251.903102744" watchObservedRunningTime="2026-03-19 10:25:53.569292835 +0000 UTC 
m=+251.918238377" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.631519 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-zs4v4" podStartSLOduration=180.631480629 podStartE2EDuration="3m0.631480629s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:53.62930285 +0000 UTC m=+251.978248392" watchObservedRunningTime="2026-03-19 10:25:53.631480629 +0000 UTC m=+251.980426171" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.650307 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:53 crc kubenswrapper[4765]: E0319 10:25:53.651137 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:54.151117435 +0000 UTC m=+252.500062977 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:53 crc kubenswrapper[4765]: E0319 10:25:53.689192 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb8a97d83_18b0_42eb_9ed9_f49ffff3d034.slice/crio-c2a3fc499763766084e1f741340244299ced9cfe6bea53084e18f4fc9d4e9a8a.scope\": RecentStats: unable to find data in memory cache]" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.752850 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:53 crc kubenswrapper[4765]: E0319 10:25:53.753765 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:54.253738201 +0000 UTC m=+252.602683743 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.792233 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-gvklb" podStartSLOduration=180.79221516 podStartE2EDuration="3m0.79221516s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:53.790602646 +0000 UTC m=+252.139548188" watchObservedRunningTime="2026-03-19 10:25:53.79221516 +0000 UTC m=+252.141160702" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.854898 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:53 crc kubenswrapper[4765]: E0319 10:25:53.855462 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:54.355442373 +0000 UTC m=+252.704387915 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.879357 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:25:53 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:25:53 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:25:53 crc kubenswrapper[4765]: healthz check failed Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.879437 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.921307 4765 generic.go:334] "Generic (PLEG): container finished" podID="72f5edb0-c000-4e80-b27d-d0d6023510f8" containerID="e4cbb1057b82fe99c1f796e131ad41f1a7123b069737d175ff6f2885b2d3c1c3" exitCode=0 Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.921398 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" event={"ID":"72f5edb0-c000-4e80-b27d-d0d6023510f8","Type":"ContainerDied","Data":"e4cbb1057b82fe99c1f796e131ad41f1a7123b069737d175ff6f2885b2d3c1c3"} Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.946209 4765 generic.go:334] "Generic (PLEG): container finished" 
podID="b8a97d83-18b0-42eb-9ed9-f49ffff3d034" containerID="c2a3fc499763766084e1f741340244299ced9cfe6bea53084e18f4fc9d4e9a8a" exitCode=0 Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.946387 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" event={"ID":"b8a97d83-18b0-42eb-9ed9-f49ffff3d034","Type":"ContainerDied","Data":"c2a3fc499763766084e1f741340244299ced9cfe6bea53084e18f4fc9d4e9a8a"} Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.959936 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:53 crc kubenswrapper[4765]: E0319 10:25:53.960542 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:54.460521037 +0000 UTC m=+252.809466579 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.961166 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" event={"ID":"ab39cf0a-a301-484b-9328-19acff8edae9","Type":"ContainerStarted","Data":"0bdf05194336b90e4fbfe2326792f73f92e4a85e7f844965a230fcad69574501"} Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.975032 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.980265 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w" event={"ID":"7b848589-8f09-4ff0-b6cc-1e92be8c5c80","Type":"ContainerDied","Data":"0e9ae14d5d9916585ecc1c92cd6b7c4759fdc42590c2e564f6845d7850275328"} Mar 19 10:25:53 crc kubenswrapper[4765]: I0319 10:25:53.980373 4765 scope.go:117] "RemoveContainer" containerID="f38159f53a328f647161dc86fe221ff09afd0d13de8cd1c60cb92b1a6d712eb0" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.062633 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:54 crc 
kubenswrapper[4765]: E0319 10:25:54.072336 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:54.572314594 +0000 UTC m=+252.921260136 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.137228 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w"] Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.146609 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-ptr8w"] Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.152346 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.164882 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:54 crc kubenswrapper[4765]: E0319 10:25:54.165787 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:54.665766721 +0000 UTC m=+253.014712263 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.264908 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh"] Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.267316 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-proxy-ca-bundles\") pod \"72f5edb0-c000-4e80-b27d-d0d6023510f8\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.267403 4765 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72f5edb0-c000-4e80-b27d-d0d6023510f8-serving-cert\") pod \"72f5edb0-c000-4e80-b27d-d0d6023510f8\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.267484 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-config\") pod \"72f5edb0-c000-4e80-b27d-d0d6023510f8\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.267554 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tpwq\" (UniqueName: \"kubernetes.io/projected/72f5edb0-c000-4e80-b27d-d0d6023510f8-kube-api-access-6tpwq\") pod \"72f5edb0-c000-4e80-b27d-d0d6023510f8\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.267590 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-client-ca\") pod \"72f5edb0-c000-4e80-b27d-d0d6023510f8\" (UID: \"72f5edb0-c000-4e80-b27d-d0d6023510f8\") " Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.268134 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.268408 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-proxy-ca-bundles" (OuterVolumeSpecName: 
"proxy-ca-bundles") pod "72f5edb0-c000-4e80-b27d-d0d6023510f8" (UID: "72f5edb0-c000-4e80-b27d-d0d6023510f8"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.268998 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-config" (OuterVolumeSpecName: "config") pod "72f5edb0-c000-4e80-b27d-d0d6023510f8" (UID: "72f5edb0-c000-4e80-b27d-d0d6023510f8"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:25:54 crc kubenswrapper[4765]: E0319 10:25:54.270034 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:54.770012652 +0000 UTC m=+253.118958394 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.270773 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-client-ca" (OuterVolumeSpecName: "client-ca") pod "72f5edb0-c000-4e80-b27d-d0d6023510f8" (UID: "72f5edb0-c000-4e80-b27d-d0d6023510f8"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.290696 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72f5edb0-c000-4e80-b27d-d0d6023510f8-kube-api-access-6tpwq" (OuterVolumeSpecName: "kube-api-access-6tpwq") pod "72f5edb0-c000-4e80-b27d-d0d6023510f8" (UID: "72f5edb0-c000-4e80-b27d-d0d6023510f8"). InnerVolumeSpecName "kube-api-access-6tpwq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.310844 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72f5edb0-c000-4e80-b27d-d0d6023510f8-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "72f5edb0-c000-4e80-b27d-d0d6023510f8" (UID: "72f5edb0-c000-4e80-b27d-d0d6023510f8"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.369370 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b848589-8f09-4ff0-b6cc-1e92be8c5c80" path="/var/lib/kubelet/pods/7b848589-8f09-4ff0-b6cc-1e92be8c5c80/volumes" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.370449 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.370840 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.370854 4765 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-6tpwq\" (UniqueName: \"kubernetes.io/projected/72f5edb0-c000-4e80-b27d-d0d6023510f8-kube-api-access-6tpwq\") on node \"crc\" DevicePath \"\"" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.370863 4765 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.370872 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/72f5edb0-c000-4e80-b27d-d0d6023510f8-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.370880 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/72f5edb0-c000-4e80-b27d-d0d6023510f8-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:25:54 crc kubenswrapper[4765]: E0319 10:25:54.371139 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:54.870937813 +0000 UTC m=+253.219883355 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.473723 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:54 crc kubenswrapper[4765]: E0319 10:25:54.474413 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:54.974388332 +0000 UTC m=+253.323333954 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.576154 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:54 crc kubenswrapper[4765]: E0319 10:25:54.576317 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:55.076288899 +0000 UTC m=+253.425234441 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.576932 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:54 crc kubenswrapper[4765]: E0319 10:25:54.577364 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:55.077342608 +0000 UTC m=+253.426288150 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.626501 4765 ???:1] "http: TLS handshake error from 192.168.126.11:35236: no serving certificate available for the kubelet" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.656045 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-rdfk6"] Mar 19 10:25:54 crc kubenswrapper[4765]: E0319 10:25:54.656669 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="72f5edb0-c000-4e80-b27d-d0d6023510f8" containerName="controller-manager" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.656791 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="72f5edb0-c000-4e80-b27d-d0d6023510f8" containerName="controller-manager" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.657107 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="72f5edb0-c000-4e80-b27d-d0d6023510f8" containerName="controller-manager" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.658316 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-rdfk6" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.662196 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.668616 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rdfk6"] Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.679367 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:54 crc kubenswrapper[4765]: E0319 10:25:54.680054 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:55.180034507 +0000 UTC m=+253.528980049 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.781357 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc196990-77bd-4e55-9380-1fa14ec297bf-catalog-content\") pod \"certified-operators-rdfk6\" (UID: \"bc196990-77bd-4e55-9380-1fa14ec297bf\") " pod="openshift-marketplace/certified-operators-rdfk6" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.781415 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc196990-77bd-4e55-9380-1fa14ec297bf-utilities\") pod \"certified-operators-rdfk6\" (UID: \"bc196990-77bd-4e55-9380-1fa14ec297bf\") " pod="openshift-marketplace/certified-operators-rdfk6" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.781461 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9s2p5\" (UniqueName: \"kubernetes.io/projected/bc196990-77bd-4e55-9380-1fa14ec297bf-kube-api-access-9s2p5\") pod \"certified-operators-rdfk6\" (UID: \"bc196990-77bd-4e55-9380-1fa14ec297bf\") " pod="openshift-marketplace/certified-operators-rdfk6" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.781487 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:54 crc kubenswrapper[4765]: E0319 10:25:54.781820 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:55.28180781 +0000 UTC m=+253.630753352 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.855909 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:25:54 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:25:54 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:25:54 crc kubenswrapper[4765]: healthz check failed Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.856028 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.860611 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-84nts"] Mar 19 10:25:54 crc 
kubenswrapper[4765]: I0319 10:25:54.870374 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84nts" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.879516 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84nts"] Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.882836 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:54 crc kubenswrapper[4765]: E0319 10:25:54.883041 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:55.383015609 +0000 UTC m=+253.731961151 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.886275 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc196990-77bd-4e55-9380-1fa14ec297bf-catalog-content\") pod \"certified-operators-rdfk6\" (UID: \"bc196990-77bd-4e55-9380-1fa14ec297bf\") " pod="openshift-marketplace/certified-operators-rdfk6" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.886393 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc196990-77bd-4e55-9380-1fa14ec297bf-utilities\") pod \"certified-operators-rdfk6\" (UID: \"bc196990-77bd-4e55-9380-1fa14ec297bf\") " pod="openshift-marketplace/certified-operators-rdfk6" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.886484 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9s2p5\" (UniqueName: \"kubernetes.io/projected/bc196990-77bd-4e55-9380-1fa14ec297bf-kube-api-access-9s2p5\") pod \"certified-operators-rdfk6\" (UID: \"bc196990-77bd-4e55-9380-1fa14ec297bf\") " pod="openshift-marketplace/certified-operators-rdfk6" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.886517 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: 
\"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:54 crc kubenswrapper[4765]: E0319 10:25:54.887170 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:55.387151141 +0000 UTC m=+253.736096673 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.887563 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc196990-77bd-4e55-9380-1fa14ec297bf-utilities\") pod \"certified-operators-rdfk6\" (UID: \"bc196990-77bd-4e55-9380-1fa14ec297bf\") " pod="openshift-marketplace/certified-operators-rdfk6" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.887897 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc196990-77bd-4e55-9380-1fa14ec297bf-catalog-content\") pod \"certified-operators-rdfk6\" (UID: \"bc196990-77bd-4e55-9380-1fa14ec297bf\") " pod="openshift-marketplace/certified-operators-rdfk6" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.889318 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.921632 4765 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-9s2p5\" (UniqueName: \"kubernetes.io/projected/bc196990-77bd-4e55-9380-1fa14ec297bf-kube-api-access-9s2p5\") pod \"certified-operators-rdfk6\" (UID: \"bc196990-77bd-4e55-9380-1fa14ec297bf\") " pod="openshift-marketplace/certified-operators-rdfk6" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.979494 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rdfk6" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.988024 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:54 crc kubenswrapper[4765]: E0319 10:25:54.988271 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:55.488207596 +0000 UTC m=+253.837153138 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.988463 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e281996-1607-4eab-a87f-f4434f4dd17a-catalog-content\") pod \"community-operators-84nts\" (UID: \"0e281996-1607-4eab-a87f-f4434f4dd17a\") " pod="openshift-marketplace/community-operators-84nts" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.988541 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e281996-1607-4eab-a87f-f4434f4dd17a-utilities\") pod \"community-operators-84nts\" (UID: \"0e281996-1607-4eab-a87f-f4434f4dd17a\") " pod="openshift-marketplace/community-operators-84nts" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.988579 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.988622 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7n6x4\" (UniqueName: 
\"kubernetes.io/projected/0e281996-1607-4eab-a87f-f4434f4dd17a-kube-api-access-7n6x4\") pod \"community-operators-84nts\" (UID: \"0e281996-1607-4eab-a87f-f4434f4dd17a\") " pod="openshift-marketplace/community-operators-84nts" Mar 19 10:25:54 crc kubenswrapper[4765]: E0319 10:25:54.989507 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:55.48948854 +0000 UTC m=+253.838434082 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.991061 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" event={"ID":"7bb184cb-8063-44da-9eb4-64cc23c9b1f4","Type":"ContainerStarted","Data":"6302cc187ae741a7ee971b29ca3120dfb14410de0055daf464fa9732841cad6f"} Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.994158 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" event={"ID":"ab39cf0a-a301-484b-9328-19acff8edae9","Type":"ContainerStarted","Data":"1e359df700d203f9e4df067df253bfa034d0639492304c523ed8c6259c507c07"} Mar 19 10:25:54 crc kubenswrapper[4765]: I0319 10:25:54.994180 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-t8k4k" 
event={"ID":"ab39cf0a-a301-484b-9328-19acff8edae9","Type":"ContainerStarted","Data":"601db614a5ac3d9c01956535ff90507257902d3426b1ef7e244a7a080e6ba3a3"} Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.015482 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" event={"ID":"c56525a8-f5a8-4c82-9c27-27da84cc5b63","Type":"ContainerStarted","Data":"96ab6990b126b3950e30ecdc7aace87fa80cf3f458a5b7e5195b2d89ddd61216"} Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.015526 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" event={"ID":"c56525a8-f5a8-4c82-9c27-27da84cc5b63","Type":"ContainerStarted","Data":"56ad1b253bb5582035e2d372c393f48ac5732012798321d67358e4f16c9f3614"} Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.015927 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.027610 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-t8k4k" podStartSLOduration=183.027584439 podStartE2EDuration="3m3.027584439s" podCreationTimestamp="2026-03-19 10:22:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:55.026124089 +0000 UTC m=+253.375069631" watchObservedRunningTime="2026-03-19 10:25:55.027584439 +0000 UTC m=+253.376529981" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.030332 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" event={"ID":"72f5edb0-c000-4e80-b27d-d0d6023510f8","Type":"ContainerDied","Data":"1620f78379a6434dd79ac5ac6c0de634083e8f3ccf4bf8e7ca5a12738d5e41b3"} Mar 19 
10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.030388 4765 scope.go:117] "RemoveContainer" containerID="e4cbb1057b82fe99c1f796e131ad41f1a7123b069737d175ff6f2885b2d3c1c3" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.030723 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-r9spg" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.072579 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" podStartSLOduration=4.072556724 podStartE2EDuration="4.072556724s" podCreationTimestamp="2026-03-19 10:25:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:55.069226844 +0000 UTC m=+253.418172386" watchObservedRunningTime="2026-03-19 10:25:55.072556724 +0000 UTC m=+253.421502266" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.076306 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-bwctw"] Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.077628 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-bwctw" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.087874 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bwctw"] Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.091308 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.091700 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e281996-1607-4eab-a87f-f4434f4dd17a-utilities\") pod \"community-operators-84nts\" (UID: \"0e281996-1607-4eab-a87f-f4434f4dd17a\") " pod="openshift-marketplace/community-operators-84nts" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.091791 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7n6x4\" (UniqueName: \"kubernetes.io/projected/0e281996-1607-4eab-a87f-f4434f4dd17a-kube-api-access-7n6x4\") pod \"community-operators-84nts\" (UID: \"0e281996-1607-4eab-a87f-f4434f4dd17a\") " pod="openshift-marketplace/community-operators-84nts" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.091863 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e281996-1607-4eab-a87f-f4434f4dd17a-catalog-content\") pod \"community-operators-84nts\" (UID: \"0e281996-1607-4eab-a87f-f4434f4dd17a\") " pod="openshift-marketplace/community-operators-84nts" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.092420 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" 
(UniqueName: \"kubernetes.io/empty-dir/0e281996-1607-4eab-a87f-f4434f4dd17a-catalog-content\") pod \"community-operators-84nts\" (UID: \"0e281996-1607-4eab-a87f-f4434f4dd17a\") " pod="openshift-marketplace/community-operators-84nts" Mar 19 10:25:55 crc kubenswrapper[4765]: E0319 10:25:55.092496 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:55.592480587 +0000 UTC m=+253.941426129 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.095107 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e281996-1607-4eab-a87f-f4434f4dd17a-utilities\") pod \"community-operators-84nts\" (UID: \"0e281996-1607-4eab-a87f-f4434f4dd17a\") " pod="openshift-marketplace/community-operators-84nts" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.101978 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r9spg"] Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.102132 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-r9spg"] Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.160803 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7n6x4\" 
(UniqueName: \"kubernetes.io/projected/0e281996-1607-4eab-a87f-f4434f4dd17a-kube-api-access-7n6x4\") pod \"community-operators-84nts\" (UID: \"0e281996-1607-4eab-a87f-f4434f4dd17a\") " pod="openshift-marketplace/community-operators-84nts" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.168121 4765 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.192554 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84nts" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.193064 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x45zq\" (UniqueName: \"kubernetes.io/projected/926bf3fe-48b1-472a-93bd-da210e7ee945-kube-api-access-x45zq\") pod \"certified-operators-bwctw\" (UID: \"926bf3fe-48b1-472a-93bd-da210e7ee945\") " pod="openshift-marketplace/certified-operators-bwctw" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.193178 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/926bf3fe-48b1-472a-93bd-da210e7ee945-utilities\") pod \"certified-operators-bwctw\" (UID: \"926bf3fe-48b1-472a-93bd-da210e7ee945\") " pod="openshift-marketplace/certified-operators-bwctw" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.193212 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/926bf3fe-48b1-472a-93bd-da210e7ee945-catalog-content\") pod \"certified-operators-bwctw\" (UID: \"926bf3fe-48b1-472a-93bd-da210e7ee945\") " pod="openshift-marketplace/certified-operators-bwctw" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.193304 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:55 crc kubenswrapper[4765]: E0319 10:25:55.193678 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:55.693659085 +0000 UTC m=+254.042604627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.269293 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-gf6md"] Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.270681 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gf6md" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.273559 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gf6md"] Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.297528 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.297910 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78gm5\" (UniqueName: \"kubernetes.io/projected/ae8ff072-c71b-412c-88a7-d834fbe98f10-kube-api-access-78gm5\") pod \"community-operators-gf6md\" (UID: \"ae8ff072-c71b-412c-88a7-d834fbe98f10\") " pod="openshift-marketplace/community-operators-gf6md" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.297983 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x45zq\" (UniqueName: \"kubernetes.io/projected/926bf3fe-48b1-472a-93bd-da210e7ee945-kube-api-access-x45zq\") pod \"certified-operators-bwctw\" (UID: \"926bf3fe-48b1-472a-93bd-da210e7ee945\") " pod="openshift-marketplace/certified-operators-bwctw" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.298035 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae8ff072-c71b-412c-88a7-d834fbe98f10-utilities\") pod \"community-operators-gf6md\" (UID: \"ae8ff072-c71b-412c-88a7-d834fbe98f10\") " pod="openshift-marketplace/community-operators-gf6md" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.298076 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae8ff072-c71b-412c-88a7-d834fbe98f10-catalog-content\") pod \"community-operators-gf6md\" (UID: \"ae8ff072-c71b-412c-88a7-d834fbe98f10\") " pod="openshift-marketplace/community-operators-gf6md" Mar 19 10:25:55 crc kubenswrapper[4765]: E0319 10:25:55.298322 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:55.798268486 +0000 UTC m=+254.147214178 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.298506 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/926bf3fe-48b1-472a-93bd-da210e7ee945-utilities\") pod \"certified-operators-bwctw\" (UID: \"926bf3fe-48b1-472a-93bd-da210e7ee945\") " pod="openshift-marketplace/certified-operators-bwctw" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.298542 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/926bf3fe-48b1-472a-93bd-da210e7ee945-catalog-content\") pod \"certified-operators-bwctw\" (UID: \"926bf3fe-48b1-472a-93bd-da210e7ee945\") " pod="openshift-marketplace/certified-operators-bwctw" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 
10:25:55.298613 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:55 crc kubenswrapper[4765]: E0319 10:25:55.298968 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:55.798933264 +0000 UTC m=+254.147878806 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.300096 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/926bf3fe-48b1-472a-93bd-da210e7ee945-utilities\") pod \"certified-operators-bwctw\" (UID: \"926bf3fe-48b1-472a-93bd-da210e7ee945\") " pod="openshift-marketplace/certified-operators-bwctw" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.300385 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/926bf3fe-48b1-472a-93bd-da210e7ee945-catalog-content\") pod \"certified-operators-bwctw\" (UID: \"926bf3fe-48b1-472a-93bd-da210e7ee945\") " pod="openshift-marketplace/certified-operators-bwctw" Mar 19 
10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.335285 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x45zq\" (UniqueName: \"kubernetes.io/projected/926bf3fe-48b1-472a-93bd-da210e7ee945-kube-api-access-x45zq\") pod \"certified-operators-bwctw\" (UID: \"926bf3fe-48b1-472a-93bd-da210e7ee945\") " pod="openshift-marketplace/certified-operators-bwctw" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.353196 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g"] Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.354101 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.358558 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.358585 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.358688 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.358688 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.358837 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.358988 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.370908 4765 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g"] Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.388974 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.399488 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:55 crc kubenswrapper[4765]: E0319 10:25:55.399679 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:55.899633218 +0000 UTC m=+254.248578760 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.399816 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.399856 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-client-ca\") pod \"controller-manager-6fbd86f94d-n5h4g\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.399879 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78gm5\" (UniqueName: \"kubernetes.io/projected/ae8ff072-c71b-412c-88a7-d834fbe98f10-kube-api-access-78gm5\") pod \"community-operators-gf6md\" (UID: \"ae8ff072-c71b-412c-88a7-d834fbe98f10\") " pod="openshift-marketplace/community-operators-gf6md" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.399919 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhg9c\" (UniqueName: 
\"kubernetes.io/projected/81d42c8d-48dc-4545-b379-32616f6bbab1-kube-api-access-vhg9c\") pod \"controller-manager-6fbd86f94d-n5h4g\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.399936 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-config\") pod \"controller-manager-6fbd86f94d-n5h4g\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.399971 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae8ff072-c71b-412c-88a7-d834fbe98f10-utilities\") pod \"community-operators-gf6md\" (UID: \"ae8ff072-c71b-412c-88a7-d834fbe98f10\") " pod="openshift-marketplace/community-operators-gf6md" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.399998 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81d42c8d-48dc-4545-b379-32616f6bbab1-serving-cert\") pod \"controller-manager-6fbd86f94d-n5h4g\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.400026 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae8ff072-c71b-412c-88a7-d834fbe98f10-catalog-content\") pod \"community-operators-gf6md\" (UID: \"ae8ff072-c71b-412c-88a7-d834fbe98f10\") " pod="openshift-marketplace/community-operators-gf6md" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.400088 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-proxy-ca-bundles\") pod \"controller-manager-6fbd86f94d-n5h4g\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: E0319 10:25:55.400496 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-19 10:25:55.900488842 +0000 UTC m=+254.249434374 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-x94pq" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.400931 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae8ff072-c71b-412c-88a7-d834fbe98f10-catalog-content\") pod \"community-operators-gf6md\" (UID: \"ae8ff072-c71b-412c-88a7-d834fbe98f10\") " pod="openshift-marketplace/community-operators-gf6md" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.401043 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae8ff072-c71b-412c-88a7-d834fbe98f10-utilities\") pod \"community-operators-gf6md\" (UID: \"ae8ff072-c71b-412c-88a7-d834fbe98f10\") " pod="openshift-marketplace/community-operators-gf6md" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.404070 
4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.410528 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwctw" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.428694 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78gm5\" (UniqueName: \"kubernetes.io/projected/ae8ff072-c71b-412c-88a7-d834fbe98f10-kube-api-access-78gm5\") pod \"community-operators-gf6md\" (UID: \"ae8ff072-c71b-412c-88a7-d834fbe98f10\") " pod="openshift-marketplace/community-operators-gf6md" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.462423 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.487168 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-rdfk6"] Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.513480 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-config-volume\") pod \"b8a97d83-18b0-42eb-9ed9-f49ffff3d034\" (UID: \"b8a97d83-18b0-42eb-9ed9-f49ffff3d034\") " Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.513893 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.513938 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-secret-volume\") pod \"b8a97d83-18b0-42eb-9ed9-f49ffff3d034\" (UID: \"b8a97d83-18b0-42eb-9ed9-f49ffff3d034\") " Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.514047 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5vd5\" (UniqueName: \"kubernetes.io/projected/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-kube-api-access-g5vd5\") pod \"b8a97d83-18b0-42eb-9ed9-f49ffff3d034\" (UID: \"b8a97d83-18b0-42eb-9ed9-f49ffff3d034\") " Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.514319 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhg9c\" (UniqueName: \"kubernetes.io/projected/81d42c8d-48dc-4545-b379-32616f6bbab1-kube-api-access-vhg9c\") pod \"controller-manager-6fbd86f94d-n5h4g\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.514350 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-config\") pod \"controller-manager-6fbd86f94d-n5h4g\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.514380 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81d42c8d-48dc-4545-b379-32616f6bbab1-serving-cert\") pod \"controller-manager-6fbd86f94d-n5h4g\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.514481 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" 
(UniqueName: \"kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-proxy-ca-bundles\") pod \"controller-manager-6fbd86f94d-n5h4g\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.514552 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-client-ca\") pod \"controller-manager-6fbd86f94d-n5h4g\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.515710 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-client-ca\") pod \"controller-manager-6fbd86f94d-n5h4g\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.516565 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-config-volume" (OuterVolumeSpecName: "config-volume") pod "b8a97d83-18b0-42eb-9ed9-f49ffff3d034" (UID: "b8a97d83-18b0-42eb-9ed9-f49ffff3d034"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:25:55 crc kubenswrapper[4765]: E0319 10:25:55.516673 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-19 10:25:56.016651528 +0000 UTC m=+254.365597070 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.520043 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-config\") pod \"controller-manager-6fbd86f94d-n5h4g\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.535217 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-kube-api-access-g5vd5" (OuterVolumeSpecName: "kube-api-access-g5vd5") pod "b8a97d83-18b0-42eb-9ed9-f49ffff3d034" (UID: "b8a97d83-18b0-42eb-9ed9-f49ffff3d034"). InnerVolumeSpecName "kube-api-access-g5vd5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.535285 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-proxy-ca-bundles\") pod \"controller-manager-6fbd86f94d-n5h4g\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.539241 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81d42c8d-48dc-4545-b379-32616f6bbab1-serving-cert\") pod \"controller-manager-6fbd86f94d-n5h4g\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.547573 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b8a97d83-18b0-42eb-9ed9-f49ffff3d034" (UID: "b8a97d83-18b0-42eb-9ed9-f49ffff3d034"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.557875 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhg9c\" (UniqueName: \"kubernetes.io/projected/81d42c8d-48dc-4545-b379-32616f6bbab1-kube-api-access-vhg9c\") pod \"controller-manager-6fbd86f94d-n5h4g\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.571419 4765 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-19T10:25:55.16816155Z","Handler":null,"Name":""} Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.586546 4765 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.586609 4765 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.617587 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.617675 4765 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 
10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.617698 4765 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.617711 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5vd5\" (UniqueName: \"kubernetes.io/projected/b8a97d83-18b0-42eb-9ed9-f49ffff3d034-kube-api-access-g5vd5\") on node \"crc\" DevicePath \"\"" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.622780 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-84nts"] Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.625807 4765 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.625864 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.647770 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-gf6md" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.651826 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-x94pq\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.674574 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.717793 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.719401 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.753634 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.798076 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-bwctw"] Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.855456 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:25:55 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:25:55 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:25:55 crc kubenswrapper[4765]: healthz check failed Mar 19 10:25:55 crc kubenswrapper[4765]: I0319 10:25:55.855522 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.013024 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.014664 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.024798 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.024881 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.033572 4765 patch_prober.go:28] interesting pod/console-f9d7485db-94dnk container/console namespace/openshift-console: Startup probe 
status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.033640 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-94dnk" podUID="39658af6-59cf-48c7-9015-2271021bd64e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.056258 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g"] Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.062578 4765 patch_prober.go:28] interesting pod/apiserver-76f77b778f-56zlb container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[+]ping ok Mar 19 10:25:56 crc kubenswrapper[4765]: [+]log ok Mar 19 10:25:56 crc kubenswrapper[4765]: [+]etcd ok Mar 19 10:25:56 crc kubenswrapper[4765]: [+]poststarthook/start-apiserver-admission-initializer ok Mar 19 10:25:56 crc kubenswrapper[4765]: [+]poststarthook/generic-apiserver-start-informers ok Mar 19 10:25:56 crc kubenswrapper[4765]: [+]poststarthook/max-in-flight-filter ok Mar 19 10:25:56 crc kubenswrapper[4765]: [+]poststarthook/storage-object-count-tracker-hook ok Mar 19 10:25:56 crc kubenswrapper[4765]: [+]poststarthook/image.openshift.io-apiserver-caches ok Mar 19 10:25:56 crc kubenswrapper[4765]: [-]poststarthook/authorization.openshift.io-bootstrapclusterroles failed: reason withheld Mar 19 10:25:56 crc kubenswrapper[4765]: [+]poststarthook/authorization.openshift.io-ensurenodebootstrap-sa ok Mar 19 10:25:56 crc kubenswrapper[4765]: [+]poststarthook/project.openshift.io-projectcache ok Mar 19 10:25:56 crc kubenswrapper[4765]: [+]poststarthook/project.openshift.io-projectauthorizationcache ok 
Mar 19 10:25:56 crc kubenswrapper[4765]: [+]poststarthook/openshift.io-startinformers ok Mar 19 10:25:56 crc kubenswrapper[4765]: [+]poststarthook/openshift.io-restmapperupdater ok Mar 19 10:25:56 crc kubenswrapper[4765]: [+]poststarthook/quota.openshift.io-clusterquotamapping ok Mar 19 10:25:56 crc kubenswrapper[4765]: livez check failed Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.062654 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-56zlb" podUID="3bde734c-df56-471b-8a70-2f555a974e57" containerName="openshift-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.079818 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.080443 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v" event={"ID":"b8a97d83-18b0-42eb-9ed9-f49ffff3d034","Type":"ContainerDied","Data":"d4d1c451020ab37fdc094102f89fe0859a6463287c23990490b782119d3199a6"} Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.080511 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d4d1c451020ab37fdc094102f89fe0859a6463287c23990490b782119d3199a6" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.104944 4765 generic.go:334] "Generic (PLEG): container finished" podID="bc196990-77bd-4e55-9380-1fa14ec297bf" containerID="0678fb96ef88d0a4aeba650a68a14b4d470047d770487d64a08feeaf8ed63f46" exitCode=0 Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.105337 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdfk6" 
event={"ID":"bc196990-77bd-4e55-9380-1fa14ec297bf","Type":"ContainerDied","Data":"0678fb96ef88d0a4aeba650a68a14b4d470047d770487d64a08feeaf8ed63f46"} Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.105398 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdfk6" event={"ID":"bc196990-77bd-4e55-9380-1fa14ec297bf","Type":"ContainerStarted","Data":"43c336263b1c5e7c25f02a2c288d7781a084db7a8cb47fa3ce0f2c9b8fce5e73"} Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.119565 4765 generic.go:334] "Generic (PLEG): container finished" podID="0e281996-1607-4eab-a87f-f4434f4dd17a" containerID="774b9113a441c342938c320af29040ecd783d4581144aaf4bc1cfe0cff8765f7" exitCode=0 Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.120207 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84nts" event={"ID":"0e281996-1607-4eab-a87f-f4434f4dd17a","Type":"ContainerDied","Data":"774b9113a441c342938c320af29040ecd783d4581144aaf4bc1cfe0cff8765f7"} Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.120301 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84nts" event={"ID":"0e281996-1607-4eab-a87f-f4434f4dd17a","Type":"ContainerStarted","Data":"194763c947fddaa6e5095ed5c794da435f48b0a727ddbe0576af3fec2fe414bf"} Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.127742 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x94pq"] Mar 19 10:25:56 crc kubenswrapper[4765]: W0319 10:25:56.147484 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54a73adb_452b_4db3_9bc2_3411d1575eb5.slice/crio-3a4e30cce45f9fbeafe7d5650bffe86e10ebac6bb3acb8c1227f488948fa0bcc WatchSource:0}: Error finding container 3a4e30cce45f9fbeafe7d5650bffe86e10ebac6bb3acb8c1227f488948fa0bcc: Status 404 returned error 
can't find the container with id 3a4e30cce45f9fbeafe7d5650bffe86e10ebac6bb3acb8c1227f488948fa0bcc Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.152849 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwctw" event={"ID":"926bf3fe-48b1-472a-93bd-da210e7ee945","Type":"ContainerStarted","Data":"6326d376af9fd4a90a4483df1c303f9c15c334035fac5a6d1044a240c9870172"} Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.162761 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" event={"ID":"7bb184cb-8063-44da-9eb4-64cc23c9b1f4","Type":"ContainerStarted","Data":"d33d52e07cec67ba96b3aee992d7e629de8b11334eef80204f5e38689697b107"} Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.214667 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 10:25:56 crc kubenswrapper[4765]: E0319 10:25:56.214990 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b8a97d83-18b0-42eb-9ed9-f49ffff3d034" containerName="collect-profiles" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.215007 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b8a97d83-18b0-42eb-9ed9-f49ffff3d034" containerName="collect-profiles" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.215180 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="b8a97d83-18b0-42eb-9ed9-f49ffff3d034" containerName="collect-profiles" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.215709 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.228587 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.233063 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.234170 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.306552 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-gf6md"] Mar 19 10:25:56 crc kubenswrapper[4765]: W0319 10:25:56.324639 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae8ff072_c71b_412c_88a7_d834fbe98f10.slice/crio-5449dfff0affcf2be11e4fd48c8a979503daaca962f185bfa21a47fa54d83287 WatchSource:0}: Error finding container 5449dfff0affcf2be11e4fd48c8a979503daaca962f185bfa21a47fa54d83287: Status 404 returned error can't find the container with id 5449dfff0affcf2be11e4fd48c8a979503daaca962f185bfa21a47fa54d83287 Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.333370 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.334539 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.336230 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdd2bba4-0982-4141-aebe-7abe2af369c5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cdd2bba4-0982-4141-aebe-7abe2af369c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.336349 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdd2bba4-0982-4141-aebe-7abe2af369c5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cdd2bba4-0982-4141-aebe-7abe2af369c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.337652 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.337884 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.348830 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.385867 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72f5edb0-c000-4e80-b27d-d0d6023510f8" path="/var/lib/kubelet/pods/72f5edb0-c000-4e80-b27d-d0d6023510f8/volumes" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.388740 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 19 10:25:56 crc 
kubenswrapper[4765]: I0319 10:25:56.438698 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdd2bba4-0982-4141-aebe-7abe2af369c5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cdd2bba4-0982-4141-aebe-7abe2af369c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.438826 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7007dbf4-9e41-40ae-b2d0-6f7855039174-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7007dbf4-9e41-40ae-b2d0-6f7855039174\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.438893 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdd2bba4-0982-4141-aebe-7abe2af369c5-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"cdd2bba4-0982-4141-aebe-7abe2af369c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.439012 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7007dbf4-9e41-40ae-b2d0-6f7855039174-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7007dbf4-9e41-40ae-b2d0-6f7855039174\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.439346 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdd2bba4-0982-4141-aebe-7abe2af369c5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cdd2bba4-0982-4141-aebe-7abe2af369c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 10:25:56 crc 
kubenswrapper[4765]: I0319 10:25:56.463048 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdd2bba4-0982-4141-aebe-7abe2af369c5-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"cdd2bba4-0982-4141-aebe-7abe2af369c5\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.542683 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7007dbf4-9e41-40ae-b2d0-6f7855039174-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7007dbf4-9e41-40ae-b2d0-6f7855039174\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.542766 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7007dbf4-9e41-40ae-b2d0-6f7855039174-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7007dbf4-9e41-40ae-b2d0-6f7855039174\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.543000 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7007dbf4-9e41-40ae-b2d0-6f7855039174-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"7007dbf4-9e41-40ae-b2d0-6f7855039174\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.569949 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7007dbf4-9e41-40ae-b2d0-6f7855039174-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"7007dbf4-9e41-40ae-b2d0-6f7855039174\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.649109 
4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-8mm2f"] Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.650436 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mm2f" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.652297 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.656323 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.668562 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mm2f"] Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.679539 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.706478 4765 patch_prober.go:28] interesting pod/downloads-7954f5f757-jnmk9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.706534 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jnmk9" podUID="4980eaf1-2428-41fe-8a4f-052aace46947" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.707138 4765 patch_prober.go:28] interesting pod/downloads-7954f5f757-jnmk9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get 
\"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.707217 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-jnmk9" podUID="4980eaf1-2428-41fe-8a4f-052aace46947" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.748077 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49e1c321-1087-47b4-a9ef-446e4cef558e-catalog-content\") pod \"redhat-marketplace-8mm2f\" (UID: \"49e1c321-1087-47b4-a9ef-446e4cef558e\") " pod="openshift-marketplace/redhat-marketplace-8mm2f" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.748150 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l6zzm\" (UniqueName: \"kubernetes.io/projected/49e1c321-1087-47b4-a9ef-446e4cef558e-kube-api-access-l6zzm\") pod \"redhat-marketplace-8mm2f\" (UID: \"49e1c321-1087-47b4-a9ef-446e4cef558e\") " pod="openshift-marketplace/redhat-marketplace-8mm2f" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.748218 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49e1c321-1087-47b4-a9ef-446e4cef558e-utilities\") pod \"redhat-marketplace-8mm2f\" (UID: \"49e1c321-1087-47b4-a9ef-446e4cef558e\") " pod="openshift-marketplace/redhat-marketplace-8mm2f" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.789695 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.848487 4765 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.849085 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49e1c321-1087-47b4-a9ef-446e4cef558e-utilities\") pod \"redhat-marketplace-8mm2f\" (UID: \"49e1c321-1087-47b4-a9ef-446e4cef558e\") " pod="openshift-marketplace/redhat-marketplace-8mm2f" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.849245 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49e1c321-1087-47b4-a9ef-446e4cef558e-catalog-content\") pod \"redhat-marketplace-8mm2f\" (UID: \"49e1c321-1087-47b4-a9ef-446e4cef558e\") " pod="openshift-marketplace/redhat-marketplace-8mm2f" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.849275 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l6zzm\" (UniqueName: \"kubernetes.io/projected/49e1c321-1087-47b4-a9ef-446e4cef558e-kube-api-access-l6zzm\") pod \"redhat-marketplace-8mm2f\" (UID: \"49e1c321-1087-47b4-a9ef-446e4cef558e\") " pod="openshift-marketplace/redhat-marketplace-8mm2f" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.850736 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49e1c321-1087-47b4-a9ef-446e4cef558e-catalog-content\") pod \"redhat-marketplace-8mm2f\" (UID: \"49e1c321-1087-47b4-a9ef-446e4cef558e\") " pod="openshift-marketplace/redhat-marketplace-8mm2f" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.850802 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49e1c321-1087-47b4-a9ef-446e4cef558e-utilities\") pod \"redhat-marketplace-8mm2f\" (UID: \"49e1c321-1087-47b4-a9ef-446e4cef558e\") " 
pod="openshift-marketplace/redhat-marketplace-8mm2f" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.852755 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:25:56 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:25:56 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:25:56 crc kubenswrapper[4765]: healthz check failed Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.852837 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.886046 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l6zzm\" (UniqueName: \"kubernetes.io/projected/49e1c321-1087-47b4-a9ef-446e4cef558e-kube-api-access-l6zzm\") pod \"redhat-marketplace-8mm2f\" (UID: \"49e1c321-1087-47b4-a9ef-446e4cef558e\") " pod="openshift-marketplace/redhat-marketplace-8mm2f" Mar 19 10:25:56 crc kubenswrapper[4765]: I0319 10:25:56.969493 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mm2f" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.057551 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rvq7h"] Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.058891 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvq7h" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.077513 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvq7h"] Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.159570 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f3a6d2d-3226-4add-983c-2d3574217f12-catalog-content\") pod \"redhat-marketplace-rvq7h\" (UID: \"4f3a6d2d-3226-4add-983c-2d3574217f12\") " pod="openshift-marketplace/redhat-marketplace-rvq7h" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.159611 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f3a6d2d-3226-4add-983c-2d3574217f12-utilities\") pod \"redhat-marketplace-rvq7h\" (UID: \"4f3a6d2d-3226-4add-983c-2d3574217f12\") " pod="openshift-marketplace/redhat-marketplace-rvq7h" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.159659 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nngkf\" (UniqueName: \"kubernetes.io/projected/4f3a6d2d-3226-4add-983c-2d3574217f12-kube-api-access-nngkf\") pod \"redhat-marketplace-rvq7h\" (UID: \"4f3a6d2d-3226-4add-983c-2d3574217f12\") " pod="openshift-marketplace/redhat-marketplace-rvq7h" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.181860 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" event={"ID":"7bb184cb-8063-44da-9eb4-64cc23c9b1f4","Type":"ContainerStarted","Data":"21e8c15a2327449b520ca06b6adc94e0ca5c3d4f9335be127909e8e179c0d587"} Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.184350 4765 generic.go:334] "Generic (PLEG): container finished" podID="ae8ff072-c71b-412c-88a7-d834fbe98f10" 
containerID="9b1a73a2603b1f7bbd4e0140dbb1d27960a836580520723d6bc82d12a054d229" exitCode=0 Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.184412 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gf6md" event={"ID":"ae8ff072-c71b-412c-88a7-d834fbe98f10","Type":"ContainerDied","Data":"9b1a73a2603b1f7bbd4e0140dbb1d27960a836580520723d6bc82d12a054d229"} Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.184438 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gf6md" event={"ID":"ae8ff072-c71b-412c-88a7-d834fbe98f10","Type":"ContainerStarted","Data":"5449dfff0affcf2be11e4fd48c8a979503daaca962f185bfa21a47fa54d83287"} Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.188043 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" event={"ID":"81d42c8d-48dc-4545-b379-32616f6bbab1","Type":"ContainerStarted","Data":"22e75b8d2fce71e2351188e5c71f8bc48ba75ef123d9c6f1a31c932cb6907bcf"} Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.188134 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" event={"ID":"81d42c8d-48dc-4545-b379-32616f6bbab1","Type":"ContainerStarted","Data":"c19491cb7c1f3da5281cf5a170f646b0ea63c74b3624db63fc7710c4c334ba63"} Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.188188 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.194105 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" event={"ID":"54a73adb-452b-4db3-9bc2-3411d1575eb5","Type":"ContainerStarted","Data":"375b68b916b3cbf45f22c93be49ba7ca4585c492f6ed54e350a958d1f07a4aac"} Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 
10:25:57.194153 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" event={"ID":"54a73adb-452b-4db3-9bc2-3411d1575eb5","Type":"ContainerStarted","Data":"3a4e30cce45f9fbeafe7d5650bffe86e10ebac6bb3acb8c1227f488948fa0bcc"} Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.194705 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.195106 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.201375 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-fl4n5" podStartSLOduration=14.201352382 podStartE2EDuration="14.201352382s" podCreationTimestamp="2026-03-19 10:25:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:57.200741545 +0000 UTC m=+255.549687097" watchObservedRunningTime="2026-03-19 10:25:57.201352382 +0000 UTC m=+255.550297924" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.213434 4765 generic.go:334] "Generic (PLEG): container finished" podID="926bf3fe-48b1-472a-93bd-da210e7ee945" containerID="8ce3548ec8213cf2ee2ffb8fa751b5a710f9a2b3b703e74fc32d49dd1b074fe3" exitCode=0 Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.214127 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwctw" event={"ID":"926bf3fe-48b1-472a-93bd-da210e7ee945","Type":"ContainerDied","Data":"8ce3548ec8213cf2ee2ffb8fa751b5a710f9a2b3b703e74fc32d49dd1b074fe3"} Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.272596 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f3a6d2d-3226-4add-983c-2d3574217f12-catalog-content\") pod \"redhat-marketplace-rvq7h\" (UID: \"4f3a6d2d-3226-4add-983c-2d3574217f12\") " pod="openshift-marketplace/redhat-marketplace-rvq7h" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.273596 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f3a6d2d-3226-4add-983c-2d3574217f12-utilities\") pod \"redhat-marketplace-rvq7h\" (UID: \"4f3a6d2d-3226-4add-983c-2d3574217f12\") " pod="openshift-marketplace/redhat-marketplace-rvq7h" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.275492 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nngkf\" (UniqueName: \"kubernetes.io/projected/4f3a6d2d-3226-4add-983c-2d3574217f12-kube-api-access-nngkf\") pod \"redhat-marketplace-rvq7h\" (UID: \"4f3a6d2d-3226-4add-983c-2d3574217f12\") " pod="openshift-marketplace/redhat-marketplace-rvq7h" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.279533 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f3a6d2d-3226-4add-983c-2d3574217f12-utilities\") pod \"redhat-marketplace-rvq7h\" (UID: \"4f3a6d2d-3226-4add-983c-2d3574217f12\") " pod="openshift-marketplace/redhat-marketplace-rvq7h" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.284206 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f3a6d2d-3226-4add-983c-2d3574217f12-catalog-content\") pod \"redhat-marketplace-rvq7h\" (UID: \"4f3a6d2d-3226-4add-983c-2d3574217f12\") " pod="openshift-marketplace/redhat-marketplace-rvq7h" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.326252 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 19 10:25:57 
crc kubenswrapper[4765]: I0319 10:25:57.327923 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" podStartSLOduration=6.327909731 podStartE2EDuration="6.327909731s" podCreationTimestamp="2026-03-19 10:25:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:57.294327355 +0000 UTC m=+255.643272897" watchObservedRunningTime="2026-03-19 10:25:57.327909731 +0000 UTC m=+255.676855273" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.352779 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nngkf\" (UniqueName: \"kubernetes.io/projected/4f3a6d2d-3226-4add-983c-2d3574217f12-kube-api-access-nngkf\") pod \"redhat-marketplace-rvq7h\" (UID: \"4f3a6d2d-3226-4add-983c-2d3574217f12\") " pod="openshift-marketplace/redhat-marketplace-rvq7h" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.354072 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" podStartSLOduration=184.354042113 podStartE2EDuration="3m4.354042113s" podCreationTimestamp="2026-03-19 10:22:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:57.344818772 +0000 UTC m=+255.693764314" watchObservedRunningTime="2026-03-19 10:25:57.354042113 +0000 UTC m=+255.702987655" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.385736 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.481627 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvq7h" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.736456 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mm2f"] Mar 19 10:25:57 crc kubenswrapper[4765]: W0319 10:25:57.750889 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod49e1c321_1087_47b4_a9ef_446e4cef558e.slice/crio-6cc07598dcd37af7e9f50607e5924fa1560aa41549d1e306d6015fdfc1721caa WatchSource:0}: Error finding container 6cc07598dcd37af7e9f50607e5924fa1560aa41549d1e306d6015fdfc1721caa: Status 404 returned error can't find the container with id 6cc07598dcd37af7e9f50607e5924fa1560aa41549d1e306d6015fdfc1721caa Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.853844 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:25:57 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:25:57 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:25:57 crc kubenswrapper[4765]: healthz check failed Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.853925 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.855360 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9ddtf"] Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.857528 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-9ddtf" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.866417 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.892692 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9ddtf"] Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.944145 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvq7h"] Mar 19 10:25:57 crc kubenswrapper[4765]: W0319 10:25:57.962845 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4f3a6d2d_3226_4add_983c_2d3574217f12.slice/crio-72c2ab8993a0db5915bf4f79b30fea2fa1eaf10c4e732dfb7c43d7e3bd26b117 WatchSource:0}: Error finding container 72c2ab8993a0db5915bf4f79b30fea2fa1eaf10c4e732dfb7c43d7e3bd26b117: Status 404 returned error can't find the container with id 72c2ab8993a0db5915bf4f79b30fea2fa1eaf10c4e732dfb7c43d7e3bd26b117 Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.994675 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/107a869c-7528-417c-a633-e775a88a3cea-catalog-content\") pod \"redhat-operators-9ddtf\" (UID: \"107a869c-7528-417c-a633-e775a88a3cea\") " pod="openshift-marketplace/redhat-operators-9ddtf" Mar 19 10:25:57 crc kubenswrapper[4765]: I0319 10:25:57.994755 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/107a869c-7528-417c-a633-e775a88a3cea-utilities\") pod \"redhat-operators-9ddtf\" (UID: \"107a869c-7528-417c-a633-e775a88a3cea\") " pod="openshift-marketplace/redhat-operators-9ddtf" Mar 19 10:25:57 crc kubenswrapper[4765]: 
I0319 10:25:57.994902 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qv5lb\" (UniqueName: \"kubernetes.io/projected/107a869c-7528-417c-a633-e775a88a3cea-kube-api-access-qv5lb\") pod \"redhat-operators-9ddtf\" (UID: \"107a869c-7528-417c-a633-e775a88a3cea\") " pod="openshift-marketplace/redhat-operators-9ddtf" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.096384 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/107a869c-7528-417c-a633-e775a88a3cea-catalog-content\") pod \"redhat-operators-9ddtf\" (UID: \"107a869c-7528-417c-a633-e775a88a3cea\") " pod="openshift-marketplace/redhat-operators-9ddtf" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.096462 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/107a869c-7528-417c-a633-e775a88a3cea-utilities\") pod \"redhat-operators-9ddtf\" (UID: \"107a869c-7528-417c-a633-e775a88a3cea\") " pod="openshift-marketplace/redhat-operators-9ddtf" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.096536 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qv5lb\" (UniqueName: \"kubernetes.io/projected/107a869c-7528-417c-a633-e775a88a3cea-kube-api-access-qv5lb\") pod \"redhat-operators-9ddtf\" (UID: \"107a869c-7528-417c-a633-e775a88a3cea\") " pod="openshift-marketplace/redhat-operators-9ddtf" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.097426 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/107a869c-7528-417c-a633-e775a88a3cea-catalog-content\") pod \"redhat-operators-9ddtf\" (UID: \"107a869c-7528-417c-a633-e775a88a3cea\") " pod="openshift-marketplace/redhat-operators-9ddtf" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.097992 
4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/107a869c-7528-417c-a633-e775a88a3cea-utilities\") pod \"redhat-operators-9ddtf\" (UID: \"107a869c-7528-417c-a633-e775a88a3cea\") " pod="openshift-marketplace/redhat-operators-9ddtf" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.120011 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qv5lb\" (UniqueName: \"kubernetes.io/projected/107a869c-7528-417c-a633-e775a88a3cea-kube-api-access-qv5lb\") pod \"redhat-operators-9ddtf\" (UID: \"107a869c-7528-417c-a633-e775a88a3cea\") " pod="openshift-marketplace/redhat-operators-9ddtf" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.196421 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9ddtf" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.246155 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7007dbf4-9e41-40ae-b2d0-6f7855039174","Type":"ContainerStarted","Data":"312017a95ba3d0c13c38925c91bfc8b097e718dcfe158543758c1e1c5ae1074f"} Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.246230 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7007dbf4-9e41-40ae-b2d0-6f7855039174","Type":"ContainerStarted","Data":"d42ab8ca1826bac841f343cb6cb4944eb929469e780e60e95cd4ed79d2757227"} Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.253394 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-cbxll"] Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.261113 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cbxll"] Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.261248 4765 util.go:30] "No sandbox for pod can 
be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cbxll" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.291403 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cdd2bba4-0982-4141-aebe-7abe2af369c5","Type":"ContainerStarted","Data":"d6aa4cb21a5528bfa254a5eb7db754fdbbc966128e264b2d946f6151abfde073"} Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.291509 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cdd2bba4-0982-4141-aebe-7abe2af369c5","Type":"ContainerStarted","Data":"ad9ae08e3f2abd1d934ff637438c2126ca2dd7ea881236488d4912118273d840"} Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.294775 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=2.294759611 podStartE2EDuration="2.294759611s" podCreationTimestamp="2026-03-19 10:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:58.273499162 +0000 UTC m=+256.622444704" watchObservedRunningTime="2026-03-19 10:25:58.294759611 +0000 UTC m=+256.643705153" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.313651 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.313635236 podStartE2EDuration="2.313635236s" podCreationTimestamp="2026-03-19 10:25:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:25:58.310158671 +0000 UTC m=+256.659104213" watchObservedRunningTime="2026-03-19 10:25:58.313635236 +0000 UTC m=+256.662580778" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.328342 4765 generic.go:334] "Generic (PLEG): 
container finished" podID="49e1c321-1087-47b4-a9ef-446e4cef558e" containerID="d25f7d9991331b77ff9502465b20dcc74b66ae55514e873efe7b4b97ab8565d0" exitCode=0 Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.328544 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mm2f" event={"ID":"49e1c321-1087-47b4-a9ef-446e4cef558e","Type":"ContainerDied","Data":"d25f7d9991331b77ff9502465b20dcc74b66ae55514e873efe7b4b97ab8565d0"} Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.328809 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mm2f" event={"ID":"49e1c321-1087-47b4-a9ef-446e4cef558e","Type":"ContainerStarted","Data":"6cc07598dcd37af7e9f50607e5924fa1560aa41549d1e306d6015fdfc1721caa"} Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.368728 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvq7h" event={"ID":"4f3a6d2d-3226-4add-983c-2d3574217f12","Type":"ContainerStarted","Data":"72c2ab8993a0db5915bf4f79b30fea2fa1eaf10c4e732dfb7c43d7e3bd26b117"} Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.402675 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/021dd78e-84b2-410d-bb02-9919a7044f3e-catalog-content\") pod \"redhat-operators-cbxll\" (UID: \"021dd78e-84b2-410d-bb02-9919a7044f3e\") " pod="openshift-marketplace/redhat-operators-cbxll" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.402738 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/021dd78e-84b2-410d-bb02-9919a7044f3e-utilities\") pod \"redhat-operators-cbxll\" (UID: \"021dd78e-84b2-410d-bb02-9919a7044f3e\") " pod="openshift-marketplace/redhat-operators-cbxll" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.402903 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9t2ln\" (UniqueName: \"kubernetes.io/projected/021dd78e-84b2-410d-bb02-9919a7044f3e-kube-api-access-9t2ln\") pod \"redhat-operators-cbxll\" (UID: \"021dd78e-84b2-410d-bb02-9919a7044f3e\") " pod="openshift-marketplace/redhat-operators-cbxll" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.504912 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9t2ln\" (UniqueName: \"kubernetes.io/projected/021dd78e-84b2-410d-bb02-9919a7044f3e-kube-api-access-9t2ln\") pod \"redhat-operators-cbxll\" (UID: \"021dd78e-84b2-410d-bb02-9919a7044f3e\") " pod="openshift-marketplace/redhat-operators-cbxll" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.505291 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/021dd78e-84b2-410d-bb02-9919a7044f3e-catalog-content\") pod \"redhat-operators-cbxll\" (UID: \"021dd78e-84b2-410d-bb02-9919a7044f3e\") " pod="openshift-marketplace/redhat-operators-cbxll" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.505326 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/021dd78e-84b2-410d-bb02-9919a7044f3e-utilities\") pod \"redhat-operators-cbxll\" (UID: \"021dd78e-84b2-410d-bb02-9919a7044f3e\") " pod="openshift-marketplace/redhat-operators-cbxll" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.507292 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/021dd78e-84b2-410d-bb02-9919a7044f3e-utilities\") pod \"redhat-operators-cbxll\" (UID: \"021dd78e-84b2-410d-bb02-9919a7044f3e\") " pod="openshift-marketplace/redhat-operators-cbxll" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.510333 4765 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/021dd78e-84b2-410d-bb02-9919a7044f3e-catalog-content\") pod \"redhat-operators-cbxll\" (UID: \"021dd78e-84b2-410d-bb02-9919a7044f3e\") " pod="openshift-marketplace/redhat-operators-cbxll" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.527194 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9t2ln\" (UniqueName: \"kubernetes.io/projected/021dd78e-84b2-410d-bb02-9919a7044f3e-kube-api-access-9t2ln\") pod \"redhat-operators-cbxll\" (UID: \"021dd78e-84b2-410d-bb02-9919a7044f3e\") " pod="openshift-marketplace/redhat-operators-cbxll" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.643580 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-cbxll" Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.768252 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9ddtf"] Mar 19 10:25:58 crc kubenswrapper[4765]: W0319 10:25:58.798471 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod107a869c_7528_417c_a633_e775a88a3cea.slice/crio-f9d69d6a65a35f854845b2540e33624cda093d383f13f1d7a28bd87d81211cf6 WatchSource:0}: Error finding container f9d69d6a65a35f854845b2540e33624cda093d383f13f1d7a28bd87d81211cf6: Status 404 returned error can't find the container with id f9d69d6a65a35f854845b2540e33624cda093d383f13f1d7a28bd87d81211cf6 Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.854231 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:25:58 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:25:58 crc 
kubenswrapper[4765]: [+]process-running ok Mar 19 10:25:58 crc kubenswrapper[4765]: healthz check failed Mar 19 10:25:58 crc kubenswrapper[4765]: I0319 10:25:58.854384 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.012280 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-cbxll"] Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.378083 4765 generic.go:334] "Generic (PLEG): container finished" podID="107a869c-7528-417c-a633-e775a88a3cea" containerID="4bffe71520afa64173fd9fa5f8f8f8745818ba42d88cde640c76b44b74574a5c" exitCode=0 Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.378206 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ddtf" event={"ID":"107a869c-7528-417c-a633-e775a88a3cea","Type":"ContainerDied","Data":"4bffe71520afa64173fd9fa5f8f8f8745818ba42d88cde640c76b44b74574a5c"} Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.378617 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ddtf" event={"ID":"107a869c-7528-417c-a633-e775a88a3cea","Type":"ContainerStarted","Data":"f9d69d6a65a35f854845b2540e33624cda093d383f13f1d7a28bd87d81211cf6"} Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.387086 4765 generic.go:334] "Generic (PLEG): container finished" podID="4f3a6d2d-3226-4add-983c-2d3574217f12" containerID="44341ad7aad4eb7e57c344aeb125fb79eb2c09f0966756a4277647833ae0c13b" exitCode=0 Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.387170 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvq7h" 
event={"ID":"4f3a6d2d-3226-4add-983c-2d3574217f12","Type":"ContainerDied","Data":"44341ad7aad4eb7e57c344aeb125fb79eb2c09f0966756a4277647833ae0c13b"} Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.406160 4765 generic.go:334] "Generic (PLEG): container finished" podID="7007dbf4-9e41-40ae-b2d0-6f7855039174" containerID="312017a95ba3d0c13c38925c91bfc8b097e718dcfe158543758c1e1c5ae1074f" exitCode=0 Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.406306 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7007dbf4-9e41-40ae-b2d0-6f7855039174","Type":"ContainerDied","Data":"312017a95ba3d0c13c38925c91bfc8b097e718dcfe158543758c1e1c5ae1074f"} Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.414692 4765 generic.go:334] "Generic (PLEG): container finished" podID="021dd78e-84b2-410d-bb02-9919a7044f3e" containerID="8838507e0552a36b0691fd5dd2535c83d431b08cc6c3018ca710dd4956cc7521" exitCode=0 Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.414861 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbxll" event={"ID":"021dd78e-84b2-410d-bb02-9919a7044f3e","Type":"ContainerDied","Data":"8838507e0552a36b0691fd5dd2535c83d431b08cc6c3018ca710dd4956cc7521"} Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.414898 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbxll" event={"ID":"021dd78e-84b2-410d-bb02-9919a7044f3e","Type":"ContainerStarted","Data":"b37cf8c8451bb3701dded3fd3733ee40a7fa517c9a6832a2bc77e154985a4651"} Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.418472 4765 generic.go:334] "Generic (PLEG): container finished" podID="cdd2bba4-0982-4141-aebe-7abe2af369c5" containerID="d6aa4cb21a5528bfa254a5eb7db754fdbbc966128e264b2d946f6151abfde073" exitCode=0 Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.418568 4765 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cdd2bba4-0982-4141-aebe-7abe2af369c5","Type":"ContainerDied","Data":"d6aa4cb21a5528bfa254a5eb7db754fdbbc966128e264b2d946f6151abfde073"} Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.530769 4765 ???:1] "http: TLS handshake error from 192.168.126.11:35250: no serving certificate available for the kubelet" Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.772789 4765 ???:1] "http: TLS handshake error from 192.168.126.11:35252: no serving certificate available for the kubelet" Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.856551 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:25:59 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:25:59 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:25:59 crc kubenswrapper[4765]: healthz check failed Mar 19 10:25:59 crc kubenswrapper[4765]: I0319 10:25:59.856663 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:26:00 crc kubenswrapper[4765]: I0319 10:26:00.146315 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565266-lgbhn"] Mar 19 10:26:00 crc kubenswrapper[4765]: I0319 10:26:00.148321 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565266-lgbhn" Mar 19 10:26:00 crc kubenswrapper[4765]: I0319 10:26:00.149449 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565266-lgbhn"] Mar 19 10:26:00 crc kubenswrapper[4765]: I0319 10:26:00.151930 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:26:00 crc kubenswrapper[4765]: I0319 10:26:00.245055 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hl9k\" (UniqueName: \"kubernetes.io/projected/cf72b802-ec4b-4a38-b575-d037677fe0dc-kube-api-access-9hl9k\") pod \"auto-csr-approver-29565266-lgbhn\" (UID: \"cf72b802-ec4b-4a38-b575-d037677fe0dc\") " pod="openshift-infra/auto-csr-approver-29565266-lgbhn" Mar 19 10:26:00 crc kubenswrapper[4765]: I0319 10:26:00.346533 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hl9k\" (UniqueName: \"kubernetes.io/projected/cf72b802-ec4b-4a38-b575-d037677fe0dc-kube-api-access-9hl9k\") pod \"auto-csr-approver-29565266-lgbhn\" (UID: \"cf72b802-ec4b-4a38-b575-d037677fe0dc\") " pod="openshift-infra/auto-csr-approver-29565266-lgbhn" Mar 19 10:26:00 crc kubenswrapper[4765]: I0319 10:26:00.368570 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hl9k\" (UniqueName: \"kubernetes.io/projected/cf72b802-ec4b-4a38-b575-d037677fe0dc-kube-api-access-9hl9k\") pod \"auto-csr-approver-29565266-lgbhn\" (UID: \"cf72b802-ec4b-4a38-b575-d037677fe0dc\") " pod="openshift-infra/auto-csr-approver-29565266-lgbhn" Mar 19 10:26:00 crc kubenswrapper[4765]: I0319 10:26:00.465991 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565266-lgbhn" Mar 19 10:26:00 crc kubenswrapper[4765]: I0319 10:26:00.853055 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:26:00 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:26:00 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:26:00 crc kubenswrapper[4765]: healthz check failed Mar 19 10:26:00 crc kubenswrapper[4765]: I0319 10:26:00.853152 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:26:01 crc kubenswrapper[4765]: I0319 10:26:01.030633 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:26:01 crc kubenswrapper[4765]: I0319 10:26:01.037195 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-56zlb" Mar 19 10:26:01 crc kubenswrapper[4765]: I0319 10:26:01.646388 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-tk8m5" Mar 19 10:26:01 crc kubenswrapper[4765]: I0319 10:26:01.656527 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:26:01 crc kubenswrapper[4765]: I0319 10:26:01.656606 4765 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:26:01 crc kubenswrapper[4765]: I0319 10:26:01.851127 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:26:01 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:26:01 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:26:01 crc kubenswrapper[4765]: healthz check failed Mar 19 10:26:01 crc kubenswrapper[4765]: I0319 10:26:01.851188 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:26:02 crc kubenswrapper[4765]: I0319 10:26:02.080928 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:26:02 crc kubenswrapper[4765]: I0319 10:26:02.851799 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:26:02 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:26:02 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:26:02 crc kubenswrapper[4765]: healthz check failed Mar 19 10:26:02 crc kubenswrapper[4765]: I0319 10:26:02.851852 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" 
podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:26:03 crc kubenswrapper[4765]: I0319 10:26:03.852729 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:26:03 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:26:03 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:26:03 crc kubenswrapper[4765]: healthz check failed Mar 19 10:26:03 crc kubenswrapper[4765]: I0319 10:26:03.853003 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:26:04 crc kubenswrapper[4765]: I0319 10:26:04.851420 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:26:04 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:26:04 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:26:04 crc kubenswrapper[4765]: healthz check failed Mar 19 10:26:04 crc kubenswrapper[4765]: I0319 10:26:04.851472 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:26:05 crc kubenswrapper[4765]: I0319 10:26:05.851322 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe 
status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:26:05 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:26:05 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:26:05 crc kubenswrapper[4765]: healthz check failed Mar 19 10:26:05 crc kubenswrapper[4765]: I0319 10:26:05.851422 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:26:06 crc kubenswrapper[4765]: I0319 10:26:06.011702 4765 patch_prober.go:28] interesting pod/console-f9d7485db-94dnk container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Mar 19 10:26:06 crc kubenswrapper[4765]: I0319 10:26:06.011754 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-94dnk" podUID="39658af6-59cf-48c7-9015-2271021bd64e" containerName="console" probeResult="failure" output="Get \"https://10.217.0.15:8443/health\": dial tcp 10.217.0.15:8443: connect: connection refused" Mar 19 10:26:06 crc kubenswrapper[4765]: I0319 10:26:06.706682 4765 patch_prober.go:28] interesting pod/downloads-7954f5f757-jnmk9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 19 10:26:06 crc kubenswrapper[4765]: I0319 10:26:06.706896 4765 patch_prober.go:28] interesting pod/downloads-7954f5f757-jnmk9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" start-of-body= Mar 19 10:26:06 crc 
kubenswrapper[4765]: I0319 10:26:06.707094 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-jnmk9" podUID="4980eaf1-2428-41fe-8a4f-052aace46947" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 19 10:26:06 crc kubenswrapper[4765]: I0319 10:26:06.707266 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-jnmk9" podUID="4980eaf1-2428-41fe-8a4f-052aace46947" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.17:8080/\": dial tcp 10.217.0.17:8080: connect: connection refused" Mar 19 10:26:06 crc kubenswrapper[4765]: I0319 10:26:06.851102 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:26:06 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:26:06 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:26:06 crc kubenswrapper[4765]: healthz check failed Mar 19 10:26:06 crc kubenswrapper[4765]: I0319 10:26:06.851168 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:26:07 crc kubenswrapper[4765]: I0319 10:26:07.851658 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:26:07 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:26:07 crc kubenswrapper[4765]: [+]process-running 
ok Mar 19 10:26:07 crc kubenswrapper[4765]: healthz check failed Mar 19 10:26:07 crc kubenswrapper[4765]: I0319 10:26:07.852133 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:26:08 crc kubenswrapper[4765]: I0319 10:26:08.851169 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:26:08 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:26:08 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:26:08 crc kubenswrapper[4765]: healthz check failed Mar 19 10:26:08 crc kubenswrapper[4765]: I0319 10:26:08.851250 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.077327 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.080785 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.159805 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7007dbf4-9e41-40ae-b2d0-6f7855039174-kubelet-dir\") pod \"7007dbf4-9e41-40ae-b2d0-6f7855039174\" (UID: \"7007dbf4-9e41-40ae-b2d0-6f7855039174\") " Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.159980 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdd2bba4-0982-4141-aebe-7abe2af369c5-kubelet-dir\") pod \"cdd2bba4-0982-4141-aebe-7abe2af369c5\" (UID: \"cdd2bba4-0982-4141-aebe-7abe2af369c5\") " Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.160069 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdd2bba4-0982-4141-aebe-7abe2af369c5-kube-api-access\") pod \"cdd2bba4-0982-4141-aebe-7abe2af369c5\" (UID: \"cdd2bba4-0982-4141-aebe-7abe2af369c5\") " Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.160076 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cdd2bba4-0982-4141-aebe-7abe2af369c5-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "cdd2bba4-0982-4141-aebe-7abe2af369c5" (UID: "cdd2bba4-0982-4141-aebe-7abe2af369c5"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.160152 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7007dbf4-9e41-40ae-b2d0-6f7855039174-kube-api-access\") pod \"7007dbf4-9e41-40ae-b2d0-6f7855039174\" (UID: \"7007dbf4-9e41-40ae-b2d0-6f7855039174\") " Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.160598 4765 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cdd2bba4-0982-4141-aebe-7abe2af369c5-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.160031 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/7007dbf4-9e41-40ae-b2d0-6f7855039174-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "7007dbf4-9e41-40ae-b2d0-6f7855039174" (UID: "7007dbf4-9e41-40ae-b2d0-6f7855039174"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.181602 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7007dbf4-9e41-40ae-b2d0-6f7855039174-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "7007dbf4-9e41-40ae-b2d0-6f7855039174" (UID: "7007dbf4-9e41-40ae-b2d0-6f7855039174"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.181766 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cdd2bba4-0982-4141-aebe-7abe2af369c5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "cdd2bba4-0982-4141-aebe-7abe2af369c5" (UID: "cdd2bba4-0982-4141-aebe-7abe2af369c5"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.262130 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cdd2bba4-0982-4141-aebe-7abe2af369c5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.262533 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/7007dbf4-9e41-40ae-b2d0-6f7855039174-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.262636 4765 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/7007dbf4-9e41-40ae-b2d0-6f7855039174-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.509635 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"cdd2bba4-0982-4141-aebe-7abe2af369c5","Type":"ContainerDied","Data":"ad9ae08e3f2abd1d934ff637438c2126ca2dd7ea881236488d4912118273d840"} Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.509678 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ad9ae08e3f2abd1d934ff637438c2126ca2dd7ea881236488d4912118273d840" Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.510264 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.512354 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"7007dbf4-9e41-40ae-b2d0-6f7855039174","Type":"ContainerDied","Data":"d42ab8ca1826bac841f343cb6cb4944eb929469e780e60e95cd4ed79d2757227"} Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.512386 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d42ab8ca1826bac841f343cb6cb4944eb929469e780e60e95cd4ed79d2757227" Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.512943 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.796725 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g"] Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.797071 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" podUID="81d42c8d-48dc-4545-b379-32616f6bbab1" containerName="controller-manager" containerID="cri-o://22e75b8d2fce71e2351188e5c71f8bc48ba75ef123d9c6f1a31c932cb6907bcf" gracePeriod=30 Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.814495 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh"] Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.814900 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" podUID="c56525a8-f5a8-4c82-9c27-27da84cc5b63" containerName="route-controller-manager" 
containerID="cri-o://96ab6990b126b3950e30ecdc7aace87fa80cf3f458a5b7e5195b2d89ddd61216" gracePeriod=30 Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.853364 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:26:09 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:26:09 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:26:09 crc kubenswrapper[4765]: healthz check failed Mar 19 10:26:09 crc kubenswrapper[4765]: I0319 10:26:09.853852 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:26:10 crc kubenswrapper[4765]: I0319 10:26:10.521773 4765 generic.go:334] "Generic (PLEG): container finished" podID="c56525a8-f5a8-4c82-9c27-27da84cc5b63" containerID="96ab6990b126b3950e30ecdc7aace87fa80cf3f458a5b7e5195b2d89ddd61216" exitCode=0 Mar 19 10:26:10 crc kubenswrapper[4765]: I0319 10:26:10.521832 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" event={"ID":"c56525a8-f5a8-4c82-9c27-27da84cc5b63","Type":"ContainerDied","Data":"96ab6990b126b3950e30ecdc7aace87fa80cf3f458a5b7e5195b2d89ddd61216"} Mar 19 10:26:10 crc kubenswrapper[4765]: I0319 10:26:10.851791 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:26:10 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:26:10 crc kubenswrapper[4765]: [+]process-running ok Mar 19 
10:26:10 crc kubenswrapper[4765]: healthz check failed Mar 19 10:26:10 crc kubenswrapper[4765]: I0319 10:26:10.851889 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:26:11 crc kubenswrapper[4765]: I0319 10:26:11.531675 4765 generic.go:334] "Generic (PLEG): container finished" podID="81d42c8d-48dc-4545-b379-32616f6bbab1" containerID="22e75b8d2fce71e2351188e5c71f8bc48ba75ef123d9c6f1a31c932cb6907bcf" exitCode=0 Mar 19 10:26:11 crc kubenswrapper[4765]: I0319 10:26:11.531755 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" event={"ID":"81d42c8d-48dc-4545-b379-32616f6bbab1","Type":"ContainerDied","Data":"22e75b8d2fce71e2351188e5c71f8bc48ba75ef123d9c6f1a31c932cb6907bcf"} Mar 19 10:26:11 crc kubenswrapper[4765]: I0319 10:26:11.851855 4765 patch_prober.go:28] interesting pod/router-default-5444994796-bj5n6 container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 19 10:26:11 crc kubenswrapper[4765]: [-]has-synced failed: reason withheld Mar 19 10:26:11 crc kubenswrapper[4765]: [+]process-running ok Mar 19 10:26:11 crc kubenswrapper[4765]: healthz check failed Mar 19 10:26:11 crc kubenswrapper[4765]: I0319 10:26:11.851925 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-bj5n6" podUID="fba67808-dc6d-4f9e-bd53-9185baa79d78" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 19 10:26:12 crc kubenswrapper[4765]: I0319 10:26:12.853557 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 
10:26:12 crc kubenswrapper[4765]: I0319 10:26:12.857486 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-bj5n6" Mar 19 10:26:13 crc kubenswrapper[4765]: I0319 10:26:13.387035 4765 patch_prober.go:28] interesting pod/route-controller-manager-5d9546575c-z4vjh container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 19 10:26:13 crc kubenswrapper[4765]: I0319 10:26:13.387116 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" podUID="c56525a8-f5a8-4c82-9c27-27da84cc5b63" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 19 10:26:13 crc kubenswrapper[4765]: E0319 10:26:13.968122 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod401e164a_fc29_412f_ab6e_1c911f6c2d0a.slice/crio-cda267e2aa43cd450829a7f9602e56db8db101ae3799ced658b4328d77c66cc7.scope\": RecentStats: unable to find data in memory cache]" Mar 19 10:26:14 crc kubenswrapper[4765]: I0319 10:26:14.554567 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-zhnjs_401e164a-fc29-412f-ab6e-1c911f6c2d0a/cluster-samples-operator/0.log" Mar 19 10:26:14 crc kubenswrapper[4765]: I0319 10:26:14.554984 4765 generic.go:334] "Generic (PLEG): container finished" podID="401e164a-fc29-412f-ab6e-1c911f6c2d0a" containerID="cda267e2aa43cd450829a7f9602e56db8db101ae3799ced658b4328d77c66cc7" exitCode=2 Mar 19 10:26:14 crc kubenswrapper[4765]: I0319 10:26:14.555026 4765 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs" event={"ID":"401e164a-fc29-412f-ab6e-1c911f6c2d0a","Type":"ContainerDied","Data":"cda267e2aa43cd450829a7f9602e56db8db101ae3799ced658b4328d77c66cc7"} Mar 19 10:26:14 crc kubenswrapper[4765]: I0319 10:26:14.555923 4765 scope.go:117] "RemoveContainer" containerID="cda267e2aa43cd450829a7f9602e56db8db101ae3799ced658b4328d77c66cc7" Mar 19 10:26:15 crc kubenswrapper[4765]: I0319 10:26:15.725459 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:26:16 crc kubenswrapper[4765]: I0319 10:26:16.038740 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:26:16 crc kubenswrapper[4765]: I0319 10:26:16.043367 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:26:16 crc kubenswrapper[4765]: I0319 10:26:16.676113 4765 patch_prober.go:28] interesting pod/controller-manager-6fbd86f94d-n5h4g container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 19 10:26:16 crc kubenswrapper[4765]: I0319 10:26:16.676535 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" podUID="81d42c8d-48dc-4545-b379-32616f6bbab1" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.50:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 19 10:26:16 crc kubenswrapper[4765]: I0319 10:26:16.722925 4765 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-jnmk9" Mar 19 10:26:20 crc kubenswrapper[4765]: I0319 10:26:20.284432 4765 ???:1] "http: TLS handshake error from 192.168.126.11:38362: no serving certificate available for the kubelet" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.134506 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.142238 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.172580 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k"] Mar 19 10:26:22 crc kubenswrapper[4765]: E0319 10:26:22.173040 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c56525a8-f5a8-4c82-9c27-27da84cc5b63" containerName="route-controller-manager" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.173073 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c56525a8-f5a8-4c82-9c27-27da84cc5b63" containerName="route-controller-manager" Mar 19 10:26:22 crc kubenswrapper[4765]: E0319 10:26:22.173090 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cdd2bba4-0982-4141-aebe-7abe2af369c5" containerName="pruner" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.173100 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="cdd2bba4-0982-4141-aebe-7abe2af369c5" containerName="pruner" Mar 19 10:26:22 crc kubenswrapper[4765]: E0319 10:26:22.173117 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7007dbf4-9e41-40ae-b2d0-6f7855039174" containerName="pruner" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.173129 4765 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="7007dbf4-9e41-40ae-b2d0-6f7855039174" containerName="pruner" Mar 19 10:26:22 crc kubenswrapper[4765]: E0319 10:26:22.173158 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="81d42c8d-48dc-4545-b379-32616f6bbab1" containerName="controller-manager" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.173170 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="81d42c8d-48dc-4545-b379-32616f6bbab1" containerName="controller-manager" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.173341 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c56525a8-f5a8-4c82-9c27-27da84cc5b63" containerName="route-controller-manager" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.173368 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="cdd2bba4-0982-4141-aebe-7abe2af369c5" containerName="pruner" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.173387 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="81d42c8d-48dc-4545-b379-32616f6bbab1" containerName="controller-manager" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.173404 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="7007dbf4-9e41-40ae-b2d0-6f7855039174" containerName="pruner" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.174145 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.184547 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k"] Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.186507 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c56525a8-f5a8-4c82-9c27-27da84cc5b63-client-ca\") pod \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.186698 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c56525a8-f5a8-4c82-9c27-27da84cc5b63-serving-cert\") pod \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.186779 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c56525a8-f5a8-4c82-9c27-27da84cc5b63-config\") pod \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.186812 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pc4tw\" (UniqueName: \"kubernetes.io/projected/c56525a8-f5a8-4c82-9c27-27da84cc5b63-kube-api-access-pc4tw\") pod \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\" (UID: \"c56525a8-f5a8-4c82-9c27-27da84cc5b63\") " Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.188590 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c56525a8-f5a8-4c82-9c27-27da84cc5b63-client-ca" (OuterVolumeSpecName: "client-ca") pod "c56525a8-f5a8-4c82-9c27-27da84cc5b63" 
(UID: "c56525a8-f5a8-4c82-9c27-27da84cc5b63"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.189383 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c56525a8-f5a8-4c82-9c27-27da84cc5b63-config" (OuterVolumeSpecName: "config") pod "c56525a8-f5a8-4c82-9c27-27da84cc5b63" (UID: "c56525a8-f5a8-4c82-9c27-27da84cc5b63"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.198871 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c56525a8-f5a8-4c82-9c27-27da84cc5b63-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "c56525a8-f5a8-4c82-9c27-27da84cc5b63" (UID: "c56525a8-f5a8-4c82-9c27-27da84cc5b63"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.201259 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c56525a8-f5a8-4c82-9c27-27da84cc5b63-kube-api-access-pc4tw" (OuterVolumeSpecName: "kube-api-access-pc4tw") pod "c56525a8-f5a8-4c82-9c27-27da84cc5b63" (UID: "c56525a8-f5a8-4c82-9c27-27da84cc5b63"). InnerVolumeSpecName "kube-api-access-pc4tw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.288086 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhg9c\" (UniqueName: \"kubernetes.io/projected/81d42c8d-48dc-4545-b379-32616f6bbab1-kube-api-access-vhg9c\") pod \"81d42c8d-48dc-4545-b379-32616f6bbab1\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.288169 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81d42c8d-48dc-4545-b379-32616f6bbab1-serving-cert\") pod \"81d42c8d-48dc-4545-b379-32616f6bbab1\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.288357 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-config\") pod \"81d42c8d-48dc-4545-b379-32616f6bbab1\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.288391 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-client-ca\") pod \"81d42c8d-48dc-4545-b379-32616f6bbab1\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.288510 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-proxy-ca-bundles\") pod \"81d42c8d-48dc-4545-b379-32616f6bbab1\" (UID: \"81d42c8d-48dc-4545-b379-32616f6bbab1\") " Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.288784 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/b49d1b9b-eab3-4900-84e3-b46719940115-config\") pod \"route-controller-manager-c47689b85-pmf7k\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.288897 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b49d1b9b-eab3-4900-84e3-b46719940115-serving-cert\") pod \"route-controller-manager-c47689b85-pmf7k\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.288948 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b49d1b9b-eab3-4900-84e3-b46719940115-client-ca\") pod \"route-controller-manager-c47689b85-pmf7k\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.288988 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5szjx\" (UniqueName: \"kubernetes.io/projected/b49d1b9b-eab3-4900-84e3-b46719940115-kube-api-access-5szjx\") pod \"route-controller-manager-c47689b85-pmf7k\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.289125 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c56525a8-f5a8-4c82-9c27-27da84cc5b63-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.289163 4765 reconciler_common.go:293] "Volume detached for 
volume \"config\" (UniqueName: \"kubernetes.io/configmap/c56525a8-f5a8-4c82-9c27-27da84cc5b63-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.289182 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pc4tw\" (UniqueName: \"kubernetes.io/projected/c56525a8-f5a8-4c82-9c27-27da84cc5b63-kube-api-access-pc4tw\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.289199 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/c56525a8-f5a8-4c82-9c27-27da84cc5b63-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.289513 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "81d42c8d-48dc-4545-b379-32616f6bbab1" (UID: "81d42c8d-48dc-4545-b379-32616f6bbab1"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.289621 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-client-ca" (OuterVolumeSpecName: "client-ca") pod "81d42c8d-48dc-4545-b379-32616f6bbab1" (UID: "81d42c8d-48dc-4545-b379-32616f6bbab1"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.289694 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-config" (OuterVolumeSpecName: "config") pod "81d42c8d-48dc-4545-b379-32616f6bbab1" (UID: "81d42c8d-48dc-4545-b379-32616f6bbab1"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.292284 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/81d42c8d-48dc-4545-b379-32616f6bbab1-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "81d42c8d-48dc-4545-b379-32616f6bbab1" (UID: "81d42c8d-48dc-4545-b379-32616f6bbab1"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.292353 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/81d42c8d-48dc-4545-b379-32616f6bbab1-kube-api-access-vhg9c" (OuterVolumeSpecName: "kube-api-access-vhg9c") pod "81d42c8d-48dc-4545-b379-32616f6bbab1" (UID: "81d42c8d-48dc-4545-b379-32616f6bbab1"). InnerVolumeSpecName "kube-api-access-vhg9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.396942 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5szjx\" (UniqueName: \"kubernetes.io/projected/b49d1b9b-eab3-4900-84e3-b46719940115-kube-api-access-5szjx\") pod \"route-controller-manager-c47689b85-pmf7k\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.397479 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b49d1b9b-eab3-4900-84e3-b46719940115-config\") pod \"route-controller-manager-c47689b85-pmf7k\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.397729 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/b49d1b9b-eab3-4900-84e3-b46719940115-serving-cert\") pod \"route-controller-manager-c47689b85-pmf7k\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.397888 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b49d1b9b-eab3-4900-84e3-b46719940115-client-ca\") pod \"route-controller-manager-c47689b85-pmf7k\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.398116 4765 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.398272 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhg9c\" (UniqueName: \"kubernetes.io/projected/81d42c8d-48dc-4545-b379-32616f6bbab1-kube-api-access-vhg9c\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.398373 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/81d42c8d-48dc-4545-b379-32616f6bbab1-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.398490 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.398630 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/81d42c8d-48dc-4545-b379-32616f6bbab1-client-ca\") on node \"crc\" 
DevicePath \"\"" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.399000 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b49d1b9b-eab3-4900-84e3-b46719940115-client-ca\") pod \"route-controller-manager-c47689b85-pmf7k\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.399268 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b49d1b9b-eab3-4900-84e3-b46719940115-config\") pod \"route-controller-manager-c47689b85-pmf7k\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.403144 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b49d1b9b-eab3-4900-84e3-b46719940115-serving-cert\") pod \"route-controller-manager-c47689b85-pmf7k\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.419125 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5szjx\" (UniqueName: \"kubernetes.io/projected/b49d1b9b-eab3-4900-84e3-b46719940115-kube-api-access-5szjx\") pod \"route-controller-manager-c47689b85-pmf7k\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:22 crc kubenswrapper[4765]: I0319 10:26:22.523275 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:23 crc kubenswrapper[4765]: I0319 10:26:22.614364 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" event={"ID":"c56525a8-f5a8-4c82-9c27-27da84cc5b63","Type":"ContainerDied","Data":"56ad1b253bb5582035e2d372c393f48ac5732012798321d67358e4f16c9f3614"} Mar 19 10:26:23 crc kubenswrapper[4765]: I0319 10:26:22.614410 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh" Mar 19 10:26:23 crc kubenswrapper[4765]: I0319 10:26:22.614452 4765 scope.go:117] "RemoveContainer" containerID="96ab6990b126b3950e30ecdc7aace87fa80cf3f458a5b7e5195b2d89ddd61216" Mar 19 10:26:23 crc kubenswrapper[4765]: I0319 10:26:22.616921 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" event={"ID":"81d42c8d-48dc-4545-b379-32616f6bbab1","Type":"ContainerDied","Data":"c19491cb7c1f3da5281cf5a170f646b0ea63c74b3624db63fc7710c4c334ba63"} Mar 19 10:26:23 crc kubenswrapper[4765]: I0319 10:26:22.617122 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g" Mar 19 10:26:23 crc kubenswrapper[4765]: I0319 10:26:22.641596 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh"] Mar 19 10:26:23 crc kubenswrapper[4765]: I0319 10:26:22.650252 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5d9546575c-z4vjh"] Mar 19 10:26:23 crc kubenswrapper[4765]: I0319 10:26:22.659742 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g"] Mar 19 10:26:23 crc kubenswrapper[4765]: I0319 10:26:22.664932 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6fbd86f94d-n5h4g"] Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.363828 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="81d42c8d-48dc-4545-b379-32616f6bbab1" path="/var/lib/kubelet/pods/81d42c8d-48dc-4545-b379-32616f6bbab1/volumes" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.365075 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c56525a8-f5a8-4c82-9c27-27da84cc5b63" path="/var/lib/kubelet/pods/c56525a8-f5a8-4c82-9c27-27da84cc5b63/volumes" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.372907 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6ddc898979-kxzj2"] Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.373778 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.378290 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.378354 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.379261 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.380867 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.380936 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.383422 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.387935 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.388606 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6ddc898979-kxzj2"] Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.538220 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zh7ft\" (UniqueName: \"kubernetes.io/projected/77a0c378-069c-4f51-b005-9916bf1fd3c3-kube-api-access-zh7ft\") pod \"controller-manager-6ddc898979-kxzj2\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " 
pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.538339 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-proxy-ca-bundles\") pod \"controller-manager-6ddc898979-kxzj2\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.538374 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-config\") pod \"controller-manager-6ddc898979-kxzj2\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.538485 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77a0c378-069c-4f51-b005-9916bf1fd3c3-serving-cert\") pod \"controller-manager-6ddc898979-kxzj2\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.538725 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-client-ca\") pod \"controller-manager-6ddc898979-kxzj2\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.640882 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-client-ca\") pod \"controller-manager-6ddc898979-kxzj2\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.641034 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zh7ft\" (UniqueName: \"kubernetes.io/projected/77a0c378-069c-4f51-b005-9916bf1fd3c3-kube-api-access-zh7ft\") pod \"controller-manager-6ddc898979-kxzj2\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.641105 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-proxy-ca-bundles\") pod \"controller-manager-6ddc898979-kxzj2\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.641130 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-config\") pod \"controller-manager-6ddc898979-kxzj2\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.641177 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77a0c378-069c-4f51-b005-9916bf1fd3c3-serving-cert\") pod \"controller-manager-6ddc898979-kxzj2\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.642760 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-client-ca\") pod \"controller-manager-6ddc898979-kxzj2\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.643490 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-proxy-ca-bundles\") pod \"controller-manager-6ddc898979-kxzj2\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.644037 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-config\") pod \"controller-manager-6ddc898979-kxzj2\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.650210 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77a0c378-069c-4f51-b005-9916bf1fd3c3-serving-cert\") pod \"controller-manager-6ddc898979-kxzj2\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.672374 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zh7ft\" (UniqueName: \"kubernetes.io/projected/77a0c378-069c-4f51-b005-9916bf1fd3c3-kube-api-access-zh7ft\") pod \"controller-manager-6ddc898979-kxzj2\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 
10:26:24 crc kubenswrapper[4765]: I0319 10:26:24.711341 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:26 crc kubenswrapper[4765]: I0319 10:26:26.877983 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-7xc9q" Mar 19 10:26:29 crc kubenswrapper[4765]: I0319 10:26:29.029578 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 10:26:29 crc kubenswrapper[4765]: I0319 10:26:29.031755 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 10:26:29 crc kubenswrapper[4765]: I0319 10:26:29.038018 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 10:26:29 crc kubenswrapper[4765]: I0319 10:26:29.040478 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 19 10:26:29 crc kubenswrapper[4765]: I0319 10:26:29.041574 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 19 10:26:29 crc kubenswrapper[4765]: I0319 10:26:29.123814 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2682143-7755-4227-a5fa-a5380cb7a31c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b2682143-7755-4227-a5fa-a5380cb7a31c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 10:26:29 crc kubenswrapper[4765]: I0319 10:26:29.123936 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/b2682143-7755-4227-a5fa-a5380cb7a31c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b2682143-7755-4227-a5fa-a5380cb7a31c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 10:26:29 crc kubenswrapper[4765]: I0319 10:26:29.225935 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2682143-7755-4227-a5fa-a5380cb7a31c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b2682143-7755-4227-a5fa-a5380cb7a31c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 10:26:29 crc kubenswrapper[4765]: I0319 10:26:29.226108 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2682143-7755-4227-a5fa-a5380cb7a31c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b2682143-7755-4227-a5fa-a5380cb7a31c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 10:26:29 crc kubenswrapper[4765]: I0319 10:26:29.226190 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2682143-7755-4227-a5fa-a5380cb7a31c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"b2682143-7755-4227-a5fa-a5380cb7a31c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 10:26:29 crc kubenswrapper[4765]: I0319 10:26:29.247908 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2682143-7755-4227-a5fa-a5380cb7a31c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"b2682143-7755-4227-a5fa-a5380cb7a31c\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 10:26:29 crc kubenswrapper[4765]: I0319 10:26:29.354038 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 10:26:29 crc kubenswrapper[4765]: I0319 10:26:29.801821 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6ddc898979-kxzj2"] Mar 19 10:26:29 crc kubenswrapper[4765]: I0319 10:26:29.893650 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k"] Mar 19 10:26:31 crc kubenswrapper[4765]: I0319 10:26:31.656792 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:26:31 crc kubenswrapper[4765]: I0319 10:26:31.656881 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:26:32 crc kubenswrapper[4765]: E0319 10:26:32.134894 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 19 10:26:32 crc kubenswrapper[4765]: E0319 10:26:32.135195 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l6zzm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-8mm2f_openshift-marketplace(49e1c321-1087-47b4-a9ef-446e4cef558e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 10:26:32 crc kubenswrapper[4765]: E0319 10:26:32.137274 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-8mm2f" podUID="49e1c321-1087-47b4-a9ef-446e4cef558e" Mar 19 10:26:32 crc 
kubenswrapper[4765]: E0319 10:26:32.333865 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 19 10:26:32 crc kubenswrapper[4765]: E0319 10:26:32.334114 4765 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 19 10:26:32 crc kubenswrapper[4765]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 19 10:26:32 crc kubenswrapper[4765]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-g8q5v,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29565264-nvg5v_openshift-infra(c5495eef-efca-4df2-81bb-bd93bb2f8a38): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 19 10:26:32 crc kubenswrapper[4765]: > logger="UnhandledError" Mar 19 10:26:32 crc kubenswrapper[4765]: E0319 10:26:32.335297 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying 
config: context canceled\"" pod="openshift-infra/auto-csr-approver-29565264-nvg5v" podUID="c5495eef-efca-4df2-81bb-bd93bb2f8a38" Mar 19 10:26:32 crc kubenswrapper[4765]: E0319 10:26:32.681001 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29565264-nvg5v" podUID="c5495eef-efca-4df2-81bb-bd93bb2f8a38" Mar 19 10:26:33 crc kubenswrapper[4765]: E0319 10:26:33.336892 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-8mm2f" podUID="49e1c321-1087-47b4-a9ef-446e4cef558e" Mar 19 10:26:34 crc kubenswrapper[4765]: E0319 10:26:34.636679 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 19 10:26:34 crc kubenswrapper[4765]: E0319 10:26:34.636888 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qv5lb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-9ddtf_openshift-marketplace(107a869c-7528-417c-a633-e775a88a3cea): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 10:26:34 crc kubenswrapper[4765]: E0319 10:26:34.638272 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-9ddtf" podUID="107a869c-7528-417c-a633-e775a88a3cea" Mar 19 10:26:34 crc 
kubenswrapper[4765]: I0319 10:26:34.815996 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 10:26:34 crc kubenswrapper[4765]: I0319 10:26:34.816895 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 10:26:34 crc kubenswrapper[4765]: I0319 10:26:34.825509 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 10:26:34 crc kubenswrapper[4765]: I0319 10:26:34.927012 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68b99dac-eeb2-4875-948b-947030c71066-kube-api-access\") pod \"installer-9-crc\" (UID: \"68b99dac-eeb2-4875-948b-947030c71066\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 10:26:34 crc kubenswrapper[4765]: I0319 10:26:34.927066 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68b99dac-eeb2-4875-948b-947030c71066-kubelet-dir\") pod \"installer-9-crc\" (UID: \"68b99dac-eeb2-4875-948b-947030c71066\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 10:26:34 crc kubenswrapper[4765]: I0319 10:26:34.927295 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/68b99dac-eeb2-4875-948b-947030c71066-var-lock\") pod \"installer-9-crc\" (UID: \"68b99dac-eeb2-4875-948b-947030c71066\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 10:26:35 crc kubenswrapper[4765]: I0319 10:26:35.029308 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68b99dac-eeb2-4875-948b-947030c71066-kube-api-access\") pod \"installer-9-crc\" (UID: 
\"68b99dac-eeb2-4875-948b-947030c71066\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 10:26:35 crc kubenswrapper[4765]: I0319 10:26:35.029367 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68b99dac-eeb2-4875-948b-947030c71066-kubelet-dir\") pod \"installer-9-crc\" (UID: \"68b99dac-eeb2-4875-948b-947030c71066\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 10:26:35 crc kubenswrapper[4765]: I0319 10:26:35.029433 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/68b99dac-eeb2-4875-948b-947030c71066-var-lock\") pod \"installer-9-crc\" (UID: \"68b99dac-eeb2-4875-948b-947030c71066\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 10:26:35 crc kubenswrapper[4765]: I0319 10:26:35.029571 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/68b99dac-eeb2-4875-948b-947030c71066-var-lock\") pod \"installer-9-crc\" (UID: \"68b99dac-eeb2-4875-948b-947030c71066\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 10:26:35 crc kubenswrapper[4765]: I0319 10:26:35.029987 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68b99dac-eeb2-4875-948b-947030c71066-kubelet-dir\") pod \"installer-9-crc\" (UID: \"68b99dac-eeb2-4875-948b-947030c71066\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 10:26:35 crc kubenswrapper[4765]: I0319 10:26:35.048909 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68b99dac-eeb2-4875-948b-947030c71066-kube-api-access\") pod \"installer-9-crc\" (UID: \"68b99dac-eeb2-4875-948b-947030c71066\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 19 10:26:35 crc kubenswrapper[4765]: I0319 10:26:35.140535 4765 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 10:26:35 crc kubenswrapper[4765]: E0319 10:26:35.636037 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-9ddtf" podUID="107a869c-7528-417c-a633-e775a88a3cea" Mar 19 10:26:36 crc kubenswrapper[4765]: E0319 10:26:36.676799 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 19 10:26:36 crc kubenswrapper[4765]: E0319 10:26:36.676990 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7n6x4,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-84nts_openshift-marketplace(0e281996-1607-4eab-a87f-f4434f4dd17a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 10:26:36 crc kubenswrapper[4765]: E0319 10:26:36.678154 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-84nts" podUID="0e281996-1607-4eab-a87f-f4434f4dd17a" Mar 19 10:26:37 crc 
kubenswrapper[4765]: E0319 10:26:37.332889 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 19 10:26:37 crc kubenswrapper[4765]: E0319 10:26:37.333100 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-78gm5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
community-operators-gf6md_openshift-marketplace(ae8ff072-c71b-412c-88a7-d834fbe98f10): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 10:26:37 crc kubenswrapper[4765]: E0319 10:26:37.334634 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-gf6md" podUID="ae8ff072-c71b-412c-88a7-d834fbe98f10" Mar 19 10:26:38 crc kubenswrapper[4765]: E0319 10:26:38.170655 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-gf6md" podUID="ae8ff072-c71b-412c-88a7-d834fbe98f10" Mar 19 10:26:38 crc kubenswrapper[4765]: E0319 10:26:38.171061 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-84nts" podUID="0e281996-1607-4eab-a87f-f4434f4dd17a" Mar 19 10:26:38 crc kubenswrapper[4765]: I0319 10:26:38.211303 4765 scope.go:117] "RemoveContainer" containerID="22e75b8d2fce71e2351188e5c71f8bc48ba75ef123d9c6f1a31c932cb6907bcf" Mar 19 10:26:38 crc kubenswrapper[4765]: E0319 10:26:38.355794 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 19 10:26:38 crc kubenswrapper[4765]: E0319 10:26:38.356445 4765 
kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x45zq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-bwctw_openshift-marketplace(926bf3fe-48b1-472a-93bd-da210e7ee945): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 10:26:38 crc kubenswrapper[4765]: E0319 10:26:38.358441 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-bwctw" podUID="926bf3fe-48b1-472a-93bd-da210e7ee945" Mar 19 10:26:38 crc kubenswrapper[4765]: E0319 10:26:38.644586 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 19 10:26:38 crc kubenswrapper[4765]: E0319 10:26:38.645303 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9s2p5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,A
ppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-rdfk6_openshift-marketplace(bc196990-77bd-4e55-9380-1fa14ec297bf): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 10:26:38 crc kubenswrapper[4765]: E0319 10:26:38.646507 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-rdfk6" podUID="bc196990-77bd-4e55-9380-1fa14ec297bf" Mar 19 10:26:38 crc kubenswrapper[4765]: I0319 10:26:38.727661 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-samples-operator_cluster-samples-operator-665b6dd947-zhnjs_401e164a-fc29-412f-ab6e-1c911f6c2d0a/cluster-samples-operator/0.log" Mar 19 10:26:38 crc kubenswrapper[4765]: I0319 10:26:38.728694 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zhnjs" event={"ID":"401e164a-fc29-412f-ab6e-1c911f6c2d0a","Type":"ContainerStarted","Data":"4820c4144ac2e9e9d8b8addcf878162f7909fc884aee971c799b6b6328a4a37c"} Mar 19 10:26:38 crc kubenswrapper[4765]: E0319 10:26:38.730441 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-bwctw" podUID="926bf3fe-48b1-472a-93bd-da210e7ee945" Mar 19 10:26:38 crc kubenswrapper[4765]: E0319 10:26:38.734084 4765 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-rdfk6" podUID="bc196990-77bd-4e55-9380-1fa14ec297bf" Mar 19 10:26:38 crc kubenswrapper[4765]: I0319 10:26:38.750147 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6ddc898979-kxzj2"] Mar 19 10:26:38 crc kubenswrapper[4765]: I0319 10:26:38.778889 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 19 10:26:38 crc kubenswrapper[4765]: I0319 10:26:38.794395 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 19 10:26:38 crc kubenswrapper[4765]: I0319 10:26:38.838273 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k"] Mar 19 10:26:38 crc kubenswrapper[4765]: I0319 10:26:38.854189 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565266-lgbhn"] Mar 19 10:26:39 crc kubenswrapper[4765]: I0319 10:26:39.738343 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565266-lgbhn" event={"ID":"cf72b802-ec4b-4a38-b575-d037677fe0dc","Type":"ContainerStarted","Data":"c10877d1b56bde8fbfb7f06070bfa3b5f3357ecbc1c431bb53624669c3167eb1"} Mar 19 10:26:39 crc kubenswrapper[4765]: I0319 10:26:39.740328 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" event={"ID":"b49d1b9b-eab3-4900-84e3-b46719940115","Type":"ContainerStarted","Data":"1dbeb7b2612e305be4f8f1988780bf0110247a63adddc53fa7e0625b70d313fd"} Mar 19 10:26:39 crc kubenswrapper[4765]: I0319 10:26:39.742303 4765 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" event={"ID":"77a0c378-069c-4f51-b005-9916bf1fd3c3","Type":"ContainerStarted","Data":"6d22fd5c6fd99b2a3d291fb2d3b6e15eba10da7038d35f0940680530ad798651"} Mar 19 10:26:39 crc kubenswrapper[4765]: I0319 10:26:39.743898 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"68b99dac-eeb2-4875-948b-947030c71066","Type":"ContainerStarted","Data":"8fe4a48e7602307e58f4ab8200acbd7d3c25267792d8ea8536938b7a5c5c5cd4"} Mar 19 10:26:39 crc kubenswrapper[4765]: I0319 10:26:39.745560 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b2682143-7755-4227-a5fa-a5380cb7a31c","Type":"ContainerStarted","Data":"c6408510ce6afd76ede2a88cadf99c3f90637fae2fbbc3cd1108f96949447bf1"} Mar 19 10:26:40 crc kubenswrapper[4765]: E0319 10:26:40.548768 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Mar 19 10:26:40 crc kubenswrapper[4765]: E0319 10:26:40.549036 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9t2ln,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-cbxll_openshift-marketplace(021dd78e-84b2-410d-bb02-9919a7044f3e): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 10:26:40 crc kubenswrapper[4765]: E0319 10:26:40.550290 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-cbxll" podUID="021dd78e-84b2-410d-bb02-9919a7044f3e" Mar 19 10:26:40 crc 
kubenswrapper[4765]: E0319 10:26:40.754279 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-cbxll" podUID="021dd78e-84b2-410d-bb02-9919a7044f3e" Mar 19 10:26:41 crc kubenswrapper[4765]: I0319 10:26:41.768503 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" event={"ID":"b49d1b9b-eab3-4900-84e3-b46719940115","Type":"ContainerStarted","Data":"7db127088a1ed8fcecb0a3f0cec9cb2faadd1478074e021831c84891348fcfdb"} Mar 19 10:26:42 crc kubenswrapper[4765]: I0319 10:26:42.785512 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" event={"ID":"77a0c378-069c-4f51-b005-9916bf1fd3c3","Type":"ContainerStarted","Data":"65a684a2153b8ef5f60022098afb026c4fefa0483e2f5f2c4335c87fa6ae0e72"} Mar 19 10:26:42 crc kubenswrapper[4765]: I0319 10:26:42.786680 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"68b99dac-eeb2-4875-948b-947030c71066","Type":"ContainerStarted","Data":"c01f5158548ae5bd98d9c3aaf891064f325017b2b31faba1770d5995bcbf8b3c"} Mar 19 10:26:42 crc kubenswrapper[4765]: I0319 10:26:42.787836 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" podUID="b49d1b9b-eab3-4900-84e3-b46719940115" containerName="route-controller-manager" containerID="cri-o://7db127088a1ed8fcecb0a3f0cec9cb2faadd1478074e021831c84891348fcfdb" gracePeriod=30 Mar 19 10:26:42 crc kubenswrapper[4765]: I0319 10:26:42.788174 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" 
event={"ID":"b2682143-7755-4227-a5fa-a5380cb7a31c","Type":"ContainerStarted","Data":"898d158864ea1dd52b07985d71f43afef4566d27418b93dd33c0e8df587db03c"} Mar 19 10:26:42 crc kubenswrapper[4765]: I0319 10:26:42.788274 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:42 crc kubenswrapper[4765]: I0319 10:26:42.793939 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:42 crc kubenswrapper[4765]: I0319 10:26:42.809061 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" podStartSLOduration=33.809032094 podStartE2EDuration="33.809032094s" podCreationTimestamp="2026-03-19 10:26:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:26:42.806933056 +0000 UTC m=+301.155878618" watchObservedRunningTime="2026-03-19 10:26:42.809032094 +0000 UTC m=+301.157977636" Mar 19 10:26:43 crc kubenswrapper[4765]: I0319 10:26:43.795929 4765 generic.go:334] "Generic (PLEG): container finished" podID="b49d1b9b-eab3-4900-84e3-b46719940115" containerID="7db127088a1ed8fcecb0a3f0cec9cb2faadd1478074e021831c84891348fcfdb" exitCode=0 Mar 19 10:26:43 crc kubenswrapper[4765]: I0319 10:26:43.796017 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" event={"ID":"b49d1b9b-eab3-4900-84e3-b46719940115","Type":"ContainerDied","Data":"7db127088a1ed8fcecb0a3f0cec9cb2faadd1478074e021831c84891348fcfdb"} Mar 19 10:26:43 crc kubenswrapper[4765]: I0319 10:26:43.799151 4765 generic.go:334] "Generic (PLEG): container finished" podID="b2682143-7755-4227-a5fa-a5380cb7a31c" 
containerID="898d158864ea1dd52b07985d71f43afef4566d27418b93dd33c0e8df587db03c" exitCode=0 Mar 19 10:26:43 crc kubenswrapper[4765]: I0319 10:26:43.799439 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b2682143-7755-4227-a5fa-a5380cb7a31c","Type":"ContainerDied","Data":"898d158864ea1dd52b07985d71f43afef4566d27418b93dd33c0e8df587db03c"} Mar 19 10:26:43 crc kubenswrapper[4765]: I0319 10:26:43.800008 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" podUID="77a0c378-069c-4f51-b005-9916bf1fd3c3" containerName="controller-manager" containerID="cri-o://65a684a2153b8ef5f60022098afb026c4fefa0483e2f5f2c4335c87fa6ae0e72" gracePeriod=30 Mar 19 10:26:43 crc kubenswrapper[4765]: I0319 10:26:43.800273 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:43 crc kubenswrapper[4765]: I0319 10:26:43.813620 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:43 crc kubenswrapper[4765]: E0319 10:26:43.821158 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 19 10:26:43 crc kubenswrapper[4765]: E0319 10:26:43.821559 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nngkf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-rvq7h_openshift-marketplace(4f3a6d2d-3226-4add-983c-2d3574217f12): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 19 10:26:43 crc kubenswrapper[4765]: E0319 10:26:43.822996 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-rvq7h" podUID="4f3a6d2d-3226-4add-983c-2d3574217f12" Mar 19 10:26:43 crc 
kubenswrapper[4765]: I0319 10:26:43.830909 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=9.830884654 podStartE2EDuration="9.830884654s" podCreationTimestamp="2026-03-19 10:26:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:26:43.823754746 +0000 UTC m=+302.172700338" watchObservedRunningTime="2026-03-19 10:26:43.830884654 +0000 UTC m=+302.179830206" Mar 19 10:26:43 crc kubenswrapper[4765]: I0319 10:26:43.863800 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" podStartSLOduration=34.863783937 podStartE2EDuration="34.863783937s" podCreationTimestamp="2026-03-19 10:26:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:26:43.862298726 +0000 UTC m=+302.211244288" watchObservedRunningTime="2026-03-19 10:26:43.863783937 +0000 UTC m=+302.212729479" Mar 19 10:26:44 crc kubenswrapper[4765]: I0319 10:26:44.808753 4765 generic.go:334] "Generic (PLEG): container finished" podID="77a0c378-069c-4f51-b005-9916bf1fd3c3" containerID="65a684a2153b8ef5f60022098afb026c4fefa0483e2f5f2c4335c87fa6ae0e72" exitCode=0 Mar 19 10:26:44 crc kubenswrapper[4765]: I0319 10:26:44.809022 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" event={"ID":"77a0c378-069c-4f51-b005-9916bf1fd3c3","Type":"ContainerDied","Data":"65a684a2153b8ef5f60022098afb026c4fefa0483e2f5f2c4335c87fa6ae0e72"} Mar 19 10:26:45 crc kubenswrapper[4765]: E0319 10:26:45.086080 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-rvq7h" podUID="4f3a6d2d-3226-4add-983c-2d3574217f12" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.133373 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.142684 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.148627 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.176345 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn"] Mar 19 10:26:45 crc kubenswrapper[4765]: E0319 10:26:45.176769 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77a0c378-069c-4f51-b005-9916bf1fd3c3" containerName="controller-manager" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.176812 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="77a0c378-069c-4f51-b005-9916bf1fd3c3" containerName="controller-manager" Mar 19 10:26:45 crc kubenswrapper[4765]: E0319 10:26:45.176835 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b49d1b9b-eab3-4900-84e3-b46719940115" containerName="route-controller-manager" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.176842 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b49d1b9b-eab3-4900-84e3-b46719940115" containerName="route-controller-manager" Mar 19 10:26:45 crc kubenswrapper[4765]: E0319 10:26:45.176853 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2682143-7755-4227-a5fa-a5380cb7a31c" 
containerName="pruner" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.176862 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2682143-7755-4227-a5fa-a5380cb7a31c" containerName="pruner" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.177062 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="b49d1b9b-eab3-4900-84e3-b46719940115" containerName="route-controller-manager" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.177080 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2682143-7755-4227-a5fa-a5380cb7a31c" containerName="pruner" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.177092 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="77a0c378-069c-4f51-b005-9916bf1fd3c3" containerName="controller-manager" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.178997 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.204530 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn"] Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.332697 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b49d1b9b-eab3-4900-84e3-b46719940115-serving-cert\") pod \"b49d1b9b-eab3-4900-84e3-b46719940115\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.332764 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b49d1b9b-eab3-4900-84e3-b46719940115-client-ca\") pod \"b49d1b9b-eab3-4900-84e3-b46719940115\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.332800 
4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-config\") pod \"77a0c378-069c-4f51-b005-9916bf1fd3c3\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.332857 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zh7ft\" (UniqueName: \"kubernetes.io/projected/77a0c378-069c-4f51-b005-9916bf1fd3c3-kube-api-access-zh7ft\") pod \"77a0c378-069c-4f51-b005-9916bf1fd3c3\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.332919 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b49d1b9b-eab3-4900-84e3-b46719940115-config\") pod \"b49d1b9b-eab3-4900-84e3-b46719940115\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.332938 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5szjx\" (UniqueName: \"kubernetes.io/projected/b49d1b9b-eab3-4900-84e3-b46719940115-kube-api-access-5szjx\") pod \"b49d1b9b-eab3-4900-84e3-b46719940115\" (UID: \"b49d1b9b-eab3-4900-84e3-b46719940115\") " Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.333061 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2682143-7755-4227-a5fa-a5380cb7a31c-kubelet-dir\") pod \"b2682143-7755-4227-a5fa-a5380cb7a31c\" (UID: \"b2682143-7755-4227-a5fa-a5380cb7a31c\") " Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.333139 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-proxy-ca-bundles\") pod 
\"77a0c378-069c-4f51-b005-9916bf1fd3c3\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.333179 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2682143-7755-4227-a5fa-a5380cb7a31c-kube-api-access\") pod \"b2682143-7755-4227-a5fa-a5380cb7a31c\" (UID: \"b2682143-7755-4227-a5fa-a5380cb7a31c\") " Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.333209 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-client-ca\") pod \"77a0c378-069c-4f51-b005-9916bf1fd3c3\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.333236 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77a0c378-069c-4f51-b005-9916bf1fd3c3-serving-cert\") pod \"77a0c378-069c-4f51-b005-9916bf1fd3c3\" (UID: \"77a0c378-069c-4f51-b005-9916bf1fd3c3\") " Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.333396 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-855wx\" (UniqueName: \"kubernetes.io/projected/a80be105-38e1-4ba5-86d3-8e91053b10d0-kube-api-access-855wx\") pod \"route-controller-manager-56cffb6447-rwkgn\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") " pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.333437 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a80be105-38e1-4ba5-86d3-8e91053b10d0-client-ca\") pod \"route-controller-manager-56cffb6447-rwkgn\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") " 
pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.333540 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a80be105-38e1-4ba5-86d3-8e91053b10d0-serving-cert\") pod \"route-controller-manager-56cffb6447-rwkgn\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") " pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.333561 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2682143-7755-4227-a5fa-a5380cb7a31c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "b2682143-7755-4227-a5fa-a5380cb7a31c" (UID: "b2682143-7755-4227-a5fa-a5380cb7a31c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.333588 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a80be105-38e1-4ba5-86d3-8e91053b10d0-config\") pod \"route-controller-manager-56cffb6447-rwkgn\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") " pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.333644 4765 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/b2682143-7755-4227-a5fa-a5380cb7a31c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.334109 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b49d1b9b-eab3-4900-84e3-b46719940115-client-ca" (OuterVolumeSpecName: "client-ca") pod "b49d1b9b-eab3-4900-84e3-b46719940115" (UID: 
"b49d1b9b-eab3-4900-84e3-b46719940115"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.334205 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-client-ca" (OuterVolumeSpecName: "client-ca") pod "77a0c378-069c-4f51-b005-9916bf1fd3c3" (UID: "77a0c378-069c-4f51-b005-9916bf1fd3c3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.334308 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-config" (OuterVolumeSpecName: "config") pod "77a0c378-069c-4f51-b005-9916bf1fd3c3" (UID: "77a0c378-069c-4f51-b005-9916bf1fd3c3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.334295 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b49d1b9b-eab3-4900-84e3-b46719940115-config" (OuterVolumeSpecName: "config") pod "b49d1b9b-eab3-4900-84e3-b46719940115" (UID: "b49d1b9b-eab3-4900-84e3-b46719940115"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.335968 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "77a0c378-069c-4f51-b005-9916bf1fd3c3" (UID: "77a0c378-069c-4f51-b005-9916bf1fd3c3"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.339951 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2682143-7755-4227-a5fa-a5380cb7a31c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "b2682143-7755-4227-a5fa-a5380cb7a31c" (UID: "b2682143-7755-4227-a5fa-a5380cb7a31c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.340044 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b49d1b9b-eab3-4900-84e3-b46719940115-kube-api-access-5szjx" (OuterVolumeSpecName: "kube-api-access-5szjx") pod "b49d1b9b-eab3-4900-84e3-b46719940115" (UID: "b49d1b9b-eab3-4900-84e3-b46719940115"). InnerVolumeSpecName "kube-api-access-5szjx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.340635 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77a0c378-069c-4f51-b005-9916bf1fd3c3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "77a0c378-069c-4f51-b005-9916bf1fd3c3" (UID: "77a0c378-069c-4f51-b005-9916bf1fd3c3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.341288 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b49d1b9b-eab3-4900-84e3-b46719940115-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "b49d1b9b-eab3-4900-84e3-b46719940115" (UID: "b49d1b9b-eab3-4900-84e3-b46719940115"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.341401 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77a0c378-069c-4f51-b005-9916bf1fd3c3-kube-api-access-zh7ft" (OuterVolumeSpecName: "kube-api-access-zh7ft") pod "77a0c378-069c-4f51-b005-9916bf1fd3c3" (UID: "77a0c378-069c-4f51-b005-9916bf1fd3c3"). InnerVolumeSpecName "kube-api-access-zh7ft". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.434569 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a80be105-38e1-4ba5-86d3-8e91053b10d0-serving-cert\") pod \"route-controller-manager-56cffb6447-rwkgn\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") " pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.434672 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a80be105-38e1-4ba5-86d3-8e91053b10d0-config\") pod \"route-controller-manager-56cffb6447-rwkgn\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") " pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.434718 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-855wx\" (UniqueName: \"kubernetes.io/projected/a80be105-38e1-4ba5-86d3-8e91053b10d0-kube-api-access-855wx\") pod \"route-controller-manager-56cffb6447-rwkgn\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") " pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.434752 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a80be105-38e1-4ba5-86d3-8e91053b10d0-client-ca\") pod \"route-controller-manager-56cffb6447-rwkgn\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") " pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.434804 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b49d1b9b-eab3-4900-84e3-b46719940115-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.434820 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5szjx\" (UniqueName: \"kubernetes.io/projected/b49d1b9b-eab3-4900-84e3-b46719940115-kube-api-access-5szjx\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.434834 4765 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.434846 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/b2682143-7755-4227-a5fa-a5380cb7a31c-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.434858 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.434870 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/77a0c378-069c-4f51-b005-9916bf1fd3c3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.434882 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/b49d1b9b-eab3-4900-84e3-b46719940115-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.434892 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/b49d1b9b-eab3-4900-84e3-b46719940115-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.435953 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a80be105-38e1-4ba5-86d3-8e91053b10d0-client-ca\") pod \"route-controller-manager-56cffb6447-rwkgn\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") " pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.436069 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/77a0c378-069c-4f51-b005-9916bf1fd3c3-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.436111 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zh7ft\" (UniqueName: \"kubernetes.io/projected/77a0c378-069c-4f51-b005-9916bf1fd3c3-kube-api-access-zh7ft\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.437085 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a80be105-38e1-4ba5-86d3-8e91053b10d0-config\") pod \"route-controller-manager-56cffb6447-rwkgn\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") " pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.440384 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a80be105-38e1-4ba5-86d3-8e91053b10d0-serving-cert\") pod 
\"route-controller-manager-56cffb6447-rwkgn\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") " pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.452489 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-855wx\" (UniqueName: \"kubernetes.io/projected/a80be105-38e1-4ba5-86d3-8e91053b10d0-kube-api-access-855wx\") pod \"route-controller-manager-56cffb6447-rwkgn\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") " pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.517126 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.712832 4765 patch_prober.go:28] interesting pod/controller-manager-6ddc898979-kxzj2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.60:8443/healthz\": context deadline exceeded" start-of-body= Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.713421 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" podUID="77a0c378-069c-4f51-b005-9916bf1fd3c3" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.60:8443/healthz\": context deadline exceeded" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.732817 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn"] Mar 19 10:26:45 crc kubenswrapper[4765]: W0319 10:26:45.749234 4765 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda80be105_38e1_4ba5_86d3_8e91053b10d0.slice/crio-57aa2240f5c2910c7527cf188f9d2a90afc70c792c57def6578b92c9202cd3d3 WatchSource:0}: Error finding container 57aa2240f5c2910c7527cf188f9d2a90afc70c792c57def6578b92c9202cd3d3: Status 404 returned error can't find the container with id 57aa2240f5c2910c7527cf188f9d2a90afc70c792c57def6578b92c9202cd3d3 Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.817223 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"b2682143-7755-4227-a5fa-a5380cb7a31c","Type":"ContainerDied","Data":"c6408510ce6afd76ede2a88cadf99c3f90637fae2fbbc3cd1108f96949447bf1"} Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.817294 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c6408510ce6afd76ede2a88cadf99c3f90637fae2fbbc3cd1108f96949447bf1" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.817398 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.821296 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" event={"ID":"a80be105-38e1-4ba5-86d3-8e91053b10d0","Type":"ContainerStarted","Data":"57aa2240f5c2910c7527cf188f9d2a90afc70c792c57def6578b92c9202cd3d3"} Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.826673 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565266-lgbhn" event={"ID":"cf72b802-ec4b-4a38-b575-d037677fe0dc","Type":"ContainerStarted","Data":"5ae3e8e4fdb71e67ae4c0d05c82a17bea85182b47a02d646d519d113fbfa7698"} Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.831047 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.831039 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k" event={"ID":"b49d1b9b-eab3-4900-84e3-b46719940115","Type":"ContainerDied","Data":"1dbeb7b2612e305be4f8f1988780bf0110247a63adddc53fa7e0625b70d313fd"} Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.831178 4765 scope.go:117] "RemoveContainer" containerID="7db127088a1ed8fcecb0a3f0cec9cb2faadd1478074e021831c84891348fcfdb" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.833087 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" event={"ID":"77a0c378-069c-4f51-b005-9916bf1fd3c3","Type":"ContainerDied","Data":"6d22fd5c6fd99b2a3d291fb2d3b6e15eba10da7038d35f0940680530ad798651"} Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.833128 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6ddc898979-kxzj2" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.853481 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565266-lgbhn" podStartSLOduration=39.751302689 podStartE2EDuration="45.85344924s" podCreationTimestamp="2026-03-19 10:26:00 +0000 UTC" firstStartedPulling="2026-03-19 10:26:39.005281013 +0000 UTC m=+297.354226555" lastFinishedPulling="2026-03-19 10:26:45.107427564 +0000 UTC m=+303.456373106" observedRunningTime="2026-03-19 10:26:45.842715142 +0000 UTC m=+304.191660684" watchObservedRunningTime="2026-03-19 10:26:45.85344924 +0000 UTC m=+304.202394782" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.855523 4765 scope.go:117] "RemoveContainer" containerID="65a684a2153b8ef5f60022098afb026c4fefa0483e2f5f2c4335c87fa6ae0e72" Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.862727 4765 csr.go:261] certificate signing request csr-c4bwj is approved, waiting to be issued Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.877848 4765 csr.go:257] certificate signing request csr-c4bwj is issued Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.879607 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6ddc898979-kxzj2"] Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.883703 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6ddc898979-kxzj2"] Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.896984 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k"] Mar 19 10:26:45 crc kubenswrapper[4765]: I0319 10:26:45.901336 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c47689b85-pmf7k"] Mar 19 10:26:46 crc 
kubenswrapper[4765]: I0319 10:26:46.366453 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77a0c378-069c-4f51-b005-9916bf1fd3c3" path="/var/lib/kubelet/pods/77a0c378-069c-4f51-b005-9916bf1fd3c3/volumes" Mar 19 10:26:46 crc kubenswrapper[4765]: I0319 10:26:46.367890 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b49d1b9b-eab3-4900-84e3-b46719940115" path="/var/lib/kubelet/pods/b49d1b9b-eab3-4900-84e3-b46719940115/volumes" Mar 19 10:26:46 crc kubenswrapper[4765]: I0319 10:26:46.840778 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" event={"ID":"a80be105-38e1-4ba5-86d3-8e91053b10d0","Type":"ContainerStarted","Data":"4c3d6d377a3a041d6abb256f455536f4278e4427b337b93c0672c9c523060398"} Mar 19 10:26:46 crc kubenswrapper[4765]: I0319 10:26:46.841136 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:46 crc kubenswrapper[4765]: I0319 10:26:46.845453 4765 generic.go:334] "Generic (PLEG): container finished" podID="cf72b802-ec4b-4a38-b575-d037677fe0dc" containerID="5ae3e8e4fdb71e67ae4c0d05c82a17bea85182b47a02d646d519d113fbfa7698" exitCode=0 Mar 19 10:26:46 crc kubenswrapper[4765]: I0319 10:26:46.845537 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565266-lgbhn" event={"ID":"cf72b802-ec4b-4a38-b575-d037677fe0dc","Type":"ContainerDied","Data":"5ae3e8e4fdb71e67ae4c0d05c82a17bea85182b47a02d646d519d113fbfa7698"} Mar 19 10:26:46 crc kubenswrapper[4765]: I0319 10:26:46.851430 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:26:46 crc kubenswrapper[4765]: I0319 10:26:46.879928 4765 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 
2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-16 10:57:20.874167507 +0000 UTC Mar 19 10:26:46 crc kubenswrapper[4765]: I0319 10:26:46.879998 4765 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7272h30m33.994172727s for next certificate rotation Mar 19 10:26:46 crc kubenswrapper[4765]: I0319 10:26:46.889495 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" podStartSLOduration=17.889473904 podStartE2EDuration="17.889473904s" podCreationTimestamp="2026-03-19 10:26:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:26:46.866522197 +0000 UTC m=+305.215467749" watchObservedRunningTime="2026-03-19 10:26:46.889473904 +0000 UTC m=+305.238419466" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.386295 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-76cbf4c64-nzmkg"] Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.387321 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.393175 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.393747 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.399059 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.399326 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.401330 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76cbf4c64-nzmkg"] Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.401808 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.402036 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.404297 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.473670 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-client-ca\") pod \"controller-manager-76cbf4c64-nzmkg\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") " 
pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.474281 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-proxy-ca-bundles\") pod \"controller-manager-76cbf4c64-nzmkg\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") " pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.474333 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-config\") pod \"controller-manager-76cbf4c64-nzmkg\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") " pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.474383 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mkp2l\" (UniqueName: \"kubernetes.io/projected/3cd360e6-46f8-4b89-930a-b54685c18697-kube-api-access-mkp2l\") pod \"controller-manager-76cbf4c64-nzmkg\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") " pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.474555 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cd360e6-46f8-4b89-930a-b54685c18697-serving-cert\") pod \"controller-manager-76cbf4c64-nzmkg\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") " pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.576118 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3cd360e6-46f8-4b89-930a-b54685c18697-serving-cert\") pod \"controller-manager-76cbf4c64-nzmkg\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") " pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.577433 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-client-ca\") pod \"controller-manager-76cbf4c64-nzmkg\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") " pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.577636 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-proxy-ca-bundles\") pod \"controller-manager-76cbf4c64-nzmkg\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") " pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.577716 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-config\") pod \"controller-manager-76cbf4c64-nzmkg\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") " pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.577797 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mkp2l\" (UniqueName: \"kubernetes.io/projected/3cd360e6-46f8-4b89-930a-b54685c18697-kube-api-access-mkp2l\") pod \"controller-manager-76cbf4c64-nzmkg\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") " pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.578728 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-client-ca\") pod \"controller-manager-76cbf4c64-nzmkg\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") " pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.579117 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-proxy-ca-bundles\") pod \"controller-manager-76cbf4c64-nzmkg\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") " pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.581615 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-config\") pod \"controller-manager-76cbf4c64-nzmkg\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") " pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.584543 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cd360e6-46f8-4b89-930a-b54685c18697-serving-cert\") pod \"controller-manager-76cbf4c64-nzmkg\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") " pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.600444 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mkp2l\" (UniqueName: \"kubernetes.io/projected/3cd360e6-46f8-4b89-930a-b54685c18697-kube-api-access-mkp2l\") pod \"controller-manager-76cbf4c64-nzmkg\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") " pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc 
kubenswrapper[4765]: I0319 10:26:47.723703 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.881062 4765 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-18 23:11:10.529478197 +0000 UTC Mar 19 10:26:47 crc kubenswrapper[4765]: I0319 10:26:47.881125 4765 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 7332h44m22.64835632s for next certificate rotation Mar 19 10:26:48 crc kubenswrapper[4765]: I0319 10:26:48.092862 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565266-lgbhn" Mar 19 10:26:48 crc kubenswrapper[4765]: I0319 10:26:48.148681 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-76cbf4c64-nzmkg"] Mar 19 10:26:48 crc kubenswrapper[4765]: W0319 10:26:48.162060 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cd360e6_46f8_4b89_930a_b54685c18697.slice/crio-5fe3863356b7401556fef8f558c875d7dc56dd1e6219392f51020412af4e5373 WatchSource:0}: Error finding container 5fe3863356b7401556fef8f558c875d7dc56dd1e6219392f51020412af4e5373: Status 404 returned error can't find the container with id 5fe3863356b7401556fef8f558c875d7dc56dd1e6219392f51020412af4e5373 Mar 19 10:26:48 crc kubenswrapper[4765]: I0319 10:26:48.188568 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9hl9k\" (UniqueName: \"kubernetes.io/projected/cf72b802-ec4b-4a38-b575-d037677fe0dc-kube-api-access-9hl9k\") pod \"cf72b802-ec4b-4a38-b575-d037677fe0dc\" (UID: \"cf72b802-ec4b-4a38-b575-d037677fe0dc\") " Mar 19 10:26:48 crc kubenswrapper[4765]: I0319 10:26:48.195268 4765 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf72b802-ec4b-4a38-b575-d037677fe0dc-kube-api-access-9hl9k" (OuterVolumeSpecName: "kube-api-access-9hl9k") pod "cf72b802-ec4b-4a38-b575-d037677fe0dc" (UID: "cf72b802-ec4b-4a38-b575-d037677fe0dc"). InnerVolumeSpecName "kube-api-access-9hl9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:26:48 crc kubenswrapper[4765]: I0319 10:26:48.290548 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9hl9k\" (UniqueName: \"kubernetes.io/projected/cf72b802-ec4b-4a38-b575-d037677fe0dc-kube-api-access-9hl9k\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:48 crc kubenswrapper[4765]: I0319 10:26:48.863313 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565264-nvg5v" event={"ID":"c5495eef-efca-4df2-81bb-bd93bb2f8a38","Type":"ContainerStarted","Data":"0bf0e12ea9fc117d627506b4888060ec90e30bfe7dff8e82eb28c4883e4b4cfd"} Mar 19 10:26:48 crc kubenswrapper[4765]: I0319 10:26:48.865521 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565266-lgbhn" Mar 19 10:26:48 crc kubenswrapper[4765]: I0319 10:26:48.865911 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565266-lgbhn" event={"ID":"cf72b802-ec4b-4a38-b575-d037677fe0dc","Type":"ContainerDied","Data":"c10877d1b56bde8fbfb7f06070bfa3b5f3357ecbc1c431bb53624669c3167eb1"} Mar 19 10:26:48 crc kubenswrapper[4765]: I0319 10:26:48.865999 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c10877d1b56bde8fbfb7f06070bfa3b5f3357ecbc1c431bb53624669c3167eb1" Mar 19 10:26:48 crc kubenswrapper[4765]: I0319 10:26:48.870068 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" event={"ID":"3cd360e6-46f8-4b89-930a-b54685c18697","Type":"ContainerStarted","Data":"0b7948baeddf7eaa2d37946679c86731e8aa060c70136556d406344d9461dd6e"} Mar 19 10:26:48 crc kubenswrapper[4765]: I0319 10:26:48.870162 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:48 crc kubenswrapper[4765]: I0319 10:26:48.870176 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" event={"ID":"3cd360e6-46f8-4b89-930a-b54685c18697","Type":"ContainerStarted","Data":"5fe3863356b7401556fef8f558c875d7dc56dd1e6219392f51020412af4e5373"} Mar 19 10:26:48 crc kubenswrapper[4765]: I0319 10:26:48.875754 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:26:48 crc kubenswrapper[4765]: I0319 10:26:48.909094 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565264-nvg5v" podStartSLOduration=109.089741705 podStartE2EDuration="2m48.909063407s" 
podCreationTimestamp="2026-03-19 10:24:00 +0000 UTC" firstStartedPulling="2026-03-19 10:25:48.725595268 +0000 UTC m=+247.074540810" lastFinishedPulling="2026-03-19 10:26:48.54491697 +0000 UTC m=+306.893862512" observedRunningTime="2026-03-19 10:26:48.88360548 +0000 UTC m=+307.232551032" watchObservedRunningTime="2026-03-19 10:26:48.909063407 +0000 UTC m=+307.258008949" Mar 19 10:26:48 crc kubenswrapper[4765]: I0319 10:26:48.909450 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" podStartSLOduration=19.909444957 podStartE2EDuration="19.909444957s" podCreationTimestamp="2026-03-19 10:26:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:26:48.906834475 +0000 UTC m=+307.255780027" watchObservedRunningTime="2026-03-19 10:26:48.909444957 +0000 UTC m=+307.258390499" Mar 19 10:26:49 crc kubenswrapper[4765]: I0319 10:26:49.886421 4765 generic.go:334] "Generic (PLEG): container finished" podID="49e1c321-1087-47b4-a9ef-446e4cef558e" containerID="13f65ac4e255c3cb8fa03fbcf460ee9161f6b15d0875e65545005024b0ef72d4" exitCode=0 Mar 19 10:26:49 crc kubenswrapper[4765]: I0319 10:26:49.886504 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mm2f" event={"ID":"49e1c321-1087-47b4-a9ef-446e4cef558e","Type":"ContainerDied","Data":"13f65ac4e255c3cb8fa03fbcf460ee9161f6b15d0875e65545005024b0ef72d4"} Mar 19 10:26:49 crc kubenswrapper[4765]: I0319 10:26:49.890165 4765 generic.go:334] "Generic (PLEG): container finished" podID="c5495eef-efca-4df2-81bb-bd93bb2f8a38" containerID="0bf0e12ea9fc117d627506b4888060ec90e30bfe7dff8e82eb28c4883e4b4cfd" exitCode=0 Mar 19 10:26:49 crc kubenswrapper[4765]: I0319 10:26:49.890230 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565264-nvg5v" 
event={"ID":"c5495eef-efca-4df2-81bb-bd93bb2f8a38","Type":"ContainerDied","Data":"0bf0e12ea9fc117d627506b4888060ec90e30bfe7dff8e82eb28c4883e4b4cfd"} Mar 19 10:26:50 crc kubenswrapper[4765]: I0319 10:26:50.902815 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mm2f" event={"ID":"49e1c321-1087-47b4-a9ef-446e4cef558e","Type":"ContainerStarted","Data":"c08c2b344ee219e96638d1552650b343b613249f27a8daec26b6418f5bbefa65"} Mar 19 10:26:50 crc kubenswrapper[4765]: I0319 10:26:50.927932 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-8mm2f" podStartSLOduration=2.992386053 podStartE2EDuration="54.927900688s" podCreationTimestamp="2026-03-19 10:25:56 +0000 UTC" firstStartedPulling="2026-03-19 10:25:58.355684202 +0000 UTC m=+256.704629744" lastFinishedPulling="2026-03-19 10:26:50.291198837 +0000 UTC m=+308.640144379" observedRunningTime="2026-03-19 10:26:50.925849821 +0000 UTC m=+309.274795373" watchObservedRunningTime="2026-03-19 10:26:50.927900688 +0000 UTC m=+309.276846240" Mar 19 10:26:51 crc kubenswrapper[4765]: I0319 10:26:51.195240 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565264-nvg5v" Mar 19 10:26:51 crc kubenswrapper[4765]: I0319 10:26:51.251364 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g8q5v\" (UniqueName: \"kubernetes.io/projected/c5495eef-efca-4df2-81bb-bd93bb2f8a38-kube-api-access-g8q5v\") pod \"c5495eef-efca-4df2-81bb-bd93bb2f8a38\" (UID: \"c5495eef-efca-4df2-81bb-bd93bb2f8a38\") " Mar 19 10:26:51 crc kubenswrapper[4765]: I0319 10:26:51.259948 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5495eef-efca-4df2-81bb-bd93bb2f8a38-kube-api-access-g8q5v" (OuterVolumeSpecName: "kube-api-access-g8q5v") pod "c5495eef-efca-4df2-81bb-bd93bb2f8a38" (UID: "c5495eef-efca-4df2-81bb-bd93bb2f8a38"). InnerVolumeSpecName "kube-api-access-g8q5v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:26:51 crc kubenswrapper[4765]: I0319 10:26:51.353858 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g8q5v\" (UniqueName: \"kubernetes.io/projected/c5495eef-efca-4df2-81bb-bd93bb2f8a38-kube-api-access-g8q5v\") on node \"crc\" DevicePath \"\"" Mar 19 10:26:51 crc kubenswrapper[4765]: I0319 10:26:51.911611 4765 generic.go:334] "Generic (PLEG): container finished" podID="0e281996-1607-4eab-a87f-f4434f4dd17a" containerID="fbc6cd53ad3e1a8d0dbd311410454858404a71579b871a2169272979b1e546ca" exitCode=0 Mar 19 10:26:51 crc kubenswrapper[4765]: I0319 10:26:51.911869 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84nts" event={"ID":"0e281996-1607-4eab-a87f-f4434f4dd17a","Type":"ContainerDied","Data":"fbc6cd53ad3e1a8d0dbd311410454858404a71579b871a2169272979b1e546ca"} Mar 19 10:26:51 crc kubenswrapper[4765]: I0319 10:26:51.915149 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ddtf" 
event={"ID":"107a869c-7528-417c-a633-e775a88a3cea","Type":"ContainerStarted","Data":"1d05e4a72115868454b2b59021206d9d77f4e1f1d491dab2074c39f47bf4d8b0"} Mar 19 10:26:51 crc kubenswrapper[4765]: I0319 10:26:51.917634 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565264-nvg5v" event={"ID":"c5495eef-efca-4df2-81bb-bd93bb2f8a38","Type":"ContainerDied","Data":"f201b92fb1a091f3f8e0794890369c4646767fdbe69f2df2861ab850ba26bd42"} Mar 19 10:26:51 crc kubenswrapper[4765]: I0319 10:26:51.917661 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f201b92fb1a091f3f8e0794890369c4646767fdbe69f2df2861ab850ba26bd42" Mar 19 10:26:51 crc kubenswrapper[4765]: I0319 10:26:51.917762 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565264-nvg5v" Mar 19 10:26:52 crc kubenswrapper[4765]: I0319 10:26:52.927758 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84nts" event={"ID":"0e281996-1607-4eab-a87f-f4434f4dd17a","Type":"ContainerStarted","Data":"89126b4470441acf75af9e9f53fd70815ee12798791b6309bfe153af1bfd8a8f"} Mar 19 10:26:52 crc kubenswrapper[4765]: I0319 10:26:52.930032 4765 generic.go:334] "Generic (PLEG): container finished" podID="107a869c-7528-417c-a633-e775a88a3cea" containerID="1d05e4a72115868454b2b59021206d9d77f4e1f1d491dab2074c39f47bf4d8b0" exitCode=0 Mar 19 10:26:52 crc kubenswrapper[4765]: I0319 10:26:52.930071 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ddtf" event={"ID":"107a869c-7528-417c-a633-e775a88a3cea","Type":"ContainerDied","Data":"1d05e4a72115868454b2b59021206d9d77f4e1f1d491dab2074c39f47bf4d8b0"} Mar 19 10:26:52 crc kubenswrapper[4765]: I0319 10:26:52.952583 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-84nts" 
podStartSLOduration=2.692748709 podStartE2EDuration="58.952551142s" podCreationTimestamp="2026-03-19 10:25:54 +0000 UTC" firstStartedPulling="2026-03-19 10:25:56.138690871 +0000 UTC m=+254.487636413" lastFinishedPulling="2026-03-19 10:26:52.398493304 +0000 UTC m=+310.747438846" observedRunningTime="2026-03-19 10:26:52.951427831 +0000 UTC m=+311.300373363" watchObservedRunningTime="2026-03-19 10:26:52.952551142 +0000 UTC m=+311.301496684" Mar 19 10:26:53 crc kubenswrapper[4765]: I0319 10:26:53.940390 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdfk6" event={"ID":"bc196990-77bd-4e55-9380-1fa14ec297bf","Type":"ContainerStarted","Data":"93261144fa403f4f672b2747aa637f1bdf07c249e4afd591e627d816afeaf16b"} Mar 19 10:26:53 crc kubenswrapper[4765]: I0319 10:26:53.945105 4765 generic.go:334] "Generic (PLEG): container finished" podID="926bf3fe-48b1-472a-93bd-da210e7ee945" containerID="8ef7b97cc52b0f17c5419e5ba0e97f09dcab8b7952818254e42695758d08d84e" exitCode=0 Mar 19 10:26:53 crc kubenswrapper[4765]: I0319 10:26:53.945175 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwctw" event={"ID":"926bf3fe-48b1-472a-93bd-da210e7ee945","Type":"ContainerDied","Data":"8ef7b97cc52b0f17c5419e5ba0e97f09dcab8b7952818254e42695758d08d84e"} Mar 19 10:26:53 crc kubenswrapper[4765]: I0319 10:26:53.947656 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ddtf" event={"ID":"107a869c-7528-417c-a633-e775a88a3cea","Type":"ContainerStarted","Data":"722b034e36aa3c7894cd3efbf217e9122089608fc33e8bbe8f792e981cadb0f9"} Mar 19 10:26:53 crc kubenswrapper[4765]: I0319 10:26:53.951971 4765 generic.go:334] "Generic (PLEG): container finished" podID="ae8ff072-c71b-412c-88a7-d834fbe98f10" containerID="59a3798a812e7e0637b96df97c052f5723e5dcd924dd41a900ae0b8b37708d1c" exitCode=0 Mar 19 10:26:53 crc kubenswrapper[4765]: I0319 10:26:53.952015 4765 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gf6md" event={"ID":"ae8ff072-c71b-412c-88a7-d834fbe98f10","Type":"ContainerDied","Data":"59a3798a812e7e0637b96df97c052f5723e5dcd924dd41a900ae0b8b37708d1c"}
Mar 19 10:26:54 crc kubenswrapper[4765]: I0319 10:26:54.041895 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9ddtf" podStartSLOduration=3.017702783 podStartE2EDuration="57.041874825s" podCreationTimestamp="2026-03-19 10:25:57 +0000 UTC" firstStartedPulling="2026-03-19 10:25:59.386903746 +0000 UTC m=+257.735849298" lastFinishedPulling="2026-03-19 10:26:53.411075798 +0000 UTC m=+311.760021340" observedRunningTime="2026-03-19 10:26:54.02331507 +0000 UTC m=+312.372260612" watchObservedRunningTime="2026-03-19 10:26:54.041874825 +0000 UTC m=+312.390820357"
Mar 19 10:26:54 crc kubenswrapper[4765]: I0319 10:26:54.968937 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gf6md" event={"ID":"ae8ff072-c71b-412c-88a7-d834fbe98f10","Type":"ContainerStarted","Data":"fdfec617eac7f4631aceac191cb15adbe17b1398a1e272884235fd5cfbe64b99"}
Mar 19 10:26:54 crc kubenswrapper[4765]: I0319 10:26:54.971389 4765 generic.go:334] "Generic (PLEG): container finished" podID="bc196990-77bd-4e55-9380-1fa14ec297bf" containerID="93261144fa403f4f672b2747aa637f1bdf07c249e4afd591e627d816afeaf16b" exitCode=0
Mar 19 10:26:54 crc kubenswrapper[4765]: I0319 10:26:54.971453 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdfk6" event={"ID":"bc196990-77bd-4e55-9380-1fa14ec297bf","Type":"ContainerDied","Data":"93261144fa403f4f672b2747aa637f1bdf07c249e4afd591e627d816afeaf16b"}
Mar 19 10:26:54 crc kubenswrapper[4765]: I0319 10:26:54.978980 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwctw" event={"ID":"926bf3fe-48b1-472a-93bd-da210e7ee945","Type":"ContainerStarted","Data":"e2ac22b6f45a2e206d80a6f7d5820f167cab6d95fcb470e2f503cc6999fdd88f"}
Mar 19 10:26:54 crc kubenswrapper[4765]: I0319 10:26:54.997146 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-gf6md" podStartSLOduration=2.652668604 podStartE2EDuration="59.997122047s" podCreationTimestamp="2026-03-19 10:25:55 +0000 UTC" firstStartedPulling="2026-03-19 10:25:57.186125296 +0000 UTC m=+255.535070848" lastFinishedPulling="2026-03-19 10:26:54.530578749 +0000 UTC m=+312.879524291" observedRunningTime="2026-03-19 10:26:54.993633301 +0000 UTC m=+313.342578863" watchObservedRunningTime="2026-03-19 10:26:54.997122047 +0000 UTC m=+313.346067589"
Mar 19 10:26:55 crc kubenswrapper[4765]: I0319 10:26:55.021841 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-bwctw" podStartSLOduration=1.711352542 podStartE2EDuration="1m0.021815063s" podCreationTimestamp="2026-03-19 10:25:55 +0000 UTC" firstStartedPulling="2026-03-19 10:25:56.164232887 +0000 UTC m=+254.513178429" lastFinishedPulling="2026-03-19 10:26:54.474695408 +0000 UTC m=+312.823640950" observedRunningTime="2026-03-19 10:26:55.016438924 +0000 UTC m=+313.365384466" watchObservedRunningTime="2026-03-19 10:26:55.021815063 +0000 UTC m=+313.370760605"
Mar 19 10:26:55 crc kubenswrapper[4765]: I0319 10:26:55.193866 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-84nts"
Mar 19 10:26:55 crc kubenswrapper[4765]: I0319 10:26:55.193994 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-84nts"
Mar 19 10:26:55 crc kubenswrapper[4765]: I0319 10:26:55.389546 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-84nts"
Mar 19 10:26:55 crc kubenswrapper[4765]: I0319 10:26:55.411132 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-bwctw"
Mar 19 10:26:55 crc kubenswrapper[4765]: I0319 10:26:55.411192 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-bwctw"
Mar 19 10:26:55 crc kubenswrapper[4765]: I0319 10:26:55.648899 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-gf6md"
Mar 19 10:26:55 crc kubenswrapper[4765]: I0319 10:26:55.649038 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-gf6md"
Mar 19 10:26:55 crc kubenswrapper[4765]: I0319 10:26:55.986943 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdfk6" event={"ID":"bc196990-77bd-4e55-9380-1fa14ec297bf","Type":"ContainerStarted","Data":"80c26f1af13018b4e3fd24cb42eff50ad38ed62a68f1711275377a2ad9a000b7"}
Mar 19 10:26:56 crc kubenswrapper[4765]: I0319 10:26:56.019929 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-rdfk6" podStartSLOduration=2.667650687 podStartE2EDuration="1m2.019904835s" podCreationTimestamp="2026-03-19 10:25:54 +0000 UTC" firstStartedPulling="2026-03-19 10:25:56.108508278 +0000 UTC m=+254.457453820" lastFinishedPulling="2026-03-19 10:26:55.460762426 +0000 UTC m=+313.809707968" observedRunningTime="2026-03-19 10:26:56.01576355 +0000 UTC m=+314.364709112" watchObservedRunningTime="2026-03-19 10:26:56.019904835 +0000 UTC m=+314.368850377"
Mar 19 10:26:56 crc kubenswrapper[4765]: I0319 10:26:56.435439 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-8sk6m"]
Mar 19 10:26:56 crc kubenswrapper[4765]: I0319 10:26:56.460309 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-bwctw" podUID="926bf3fe-48b1-472a-93bd-da210e7ee945" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:26:56 crc kubenswrapper[4765]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:26:56 crc kubenswrapper[4765]: >
Mar 19 10:26:56 crc kubenswrapper[4765]: I0319 10:26:56.705042 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-gf6md" podUID="ae8ff072-c71b-412c-88a7-d834fbe98f10" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:26:56 crc kubenswrapper[4765]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:26:56 crc kubenswrapper[4765]: >
Mar 19 10:26:56 crc kubenswrapper[4765]: I0319 10:26:56.970887 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-8mm2f"
Mar 19 10:26:56 crc kubenswrapper[4765]: I0319 10:26:56.970999 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-8mm2f"
Mar 19 10:26:57 crc kubenswrapper[4765]: I0319 10:26:57.075160 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-8mm2f"
Mar 19 10:26:58 crc kubenswrapper[4765]: I0319 10:26:58.004000 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbxll" event={"ID":"021dd78e-84b2-410d-bb02-9919a7044f3e","Type":"ContainerStarted","Data":"e501fb90109deadfe6e8e5ee213da9c472f27d98ef44b556eee908f3656cf297"}
Mar 19 10:26:58 crc kubenswrapper[4765]: I0319 10:26:58.049664 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-8mm2f"
Mar 19 10:26:58 crc kubenswrapper[4765]: I0319 10:26:58.196588 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9ddtf"
Mar 19 10:26:58 crc kubenswrapper[4765]: I0319 10:26:58.196662 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9ddtf"
Mar 19 10:26:59 crc kubenswrapper[4765]: I0319 10:26:59.020635 4765 generic.go:334] "Generic (PLEG): container finished" podID="021dd78e-84b2-410d-bb02-9919a7044f3e" containerID="e501fb90109deadfe6e8e5ee213da9c472f27d98ef44b556eee908f3656cf297" exitCode=0
Mar 19 10:26:59 crc kubenswrapper[4765]: I0319 10:26:59.021080 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbxll" event={"ID":"021dd78e-84b2-410d-bb02-9919a7044f3e","Type":"ContainerDied","Data":"e501fb90109deadfe6e8e5ee213da9c472f27d98ef44b556eee908f3656cf297"}
Mar 19 10:26:59 crc kubenswrapper[4765]: I0319 10:26:59.238465 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9ddtf" podUID="107a869c-7528-417c-a633-e775a88a3cea" containerName="registry-server" probeResult="failure" output=<
Mar 19 10:26:59 crc kubenswrapper[4765]: timeout: failed to connect service ":50051" within 1s
Mar 19 10:26:59 crc kubenswrapper[4765]: >
Mar 19 10:27:00 crc kubenswrapper[4765]: I0319 10:27:00.035325 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbxll" event={"ID":"021dd78e-84b2-410d-bb02-9919a7044f3e","Type":"ContainerStarted","Data":"bf66ec33f20818ffa880ff347a6ab435e4dfc2d802774b8d1332f8bf378b0ea4"}
Mar 19 10:27:00 crc kubenswrapper[4765]: I0319 10:27:00.056633 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-cbxll" podStartSLOduration=1.673130103 podStartE2EDuration="1m2.05660646s" podCreationTimestamp="2026-03-19 10:25:58 +0000 UTC" firstStartedPulling="2026-03-19 10:25:59.441468113 +0000 UTC m=+257.790413645" lastFinishedPulling="2026-03-19 10:26:59.82494447 +0000 UTC m=+318.173890002" observedRunningTime="2026-03-19 10:27:00.055979562 +0000 UTC m=+318.404925124" watchObservedRunningTime="2026-03-19 10:27:00.05660646 +0000 UTC m=+318.405552002"
Mar 19 10:27:01 crc kubenswrapper[4765]: I0319 10:27:01.043912 4765 generic.go:334] "Generic (PLEG): container finished" podID="4f3a6d2d-3226-4add-983c-2d3574217f12" containerID="49807758b3a07252cb9b8dc9e95991c7fbf44f1231ce88f53219016d14c75f4f" exitCode=0
Mar 19 10:27:01 crc kubenswrapper[4765]: I0319 10:27:01.043973 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvq7h" event={"ID":"4f3a6d2d-3226-4add-983c-2d3574217f12","Type":"ContainerDied","Data":"49807758b3a07252cb9b8dc9e95991c7fbf44f1231ce88f53219016d14c75f4f"}
Mar 19 10:27:01 crc kubenswrapper[4765]: I0319 10:27:01.656586 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 10:27:01 crc kubenswrapper[4765]: I0319 10:27:01.657087 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 10:27:01 crc kubenswrapper[4765]: I0319 10:27:01.657240 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l"
Mar 19 10:27:01 crc kubenswrapper[4765]: I0319 10:27:01.658257 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"bac2fb12c527f8417dc08378065d3f56b5cc55052743230c4bb2f532ca6be00d"} pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 19 10:27:01 crc kubenswrapper[4765]: I0319 10:27:01.658458 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://bac2fb12c527f8417dc08378065d3f56b5cc55052743230c4bb2f532ca6be00d" gracePeriod=600
Mar 19 10:27:02 crc kubenswrapper[4765]: I0319 10:27:02.053760 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvq7h" event={"ID":"4f3a6d2d-3226-4add-983c-2d3574217f12","Type":"ContainerStarted","Data":"be90c5a32e905643643864c329a5dd85bd3c736c6481c29b5c9877f50095cac7"}
Mar 19 10:27:02 crc kubenswrapper[4765]: I0319 10:27:02.057447 4765 generic.go:334] "Generic (PLEG): container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerID="bac2fb12c527f8417dc08378065d3f56b5cc55052743230c4bb2f532ca6be00d" exitCode=0
Mar 19 10:27:02 crc kubenswrapper[4765]: I0319 10:27:02.057493 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"bac2fb12c527f8417dc08378065d3f56b5cc55052743230c4bb2f532ca6be00d"}
Mar 19 10:27:02 crc kubenswrapper[4765]: I0319 10:27:02.076765 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rvq7h" podStartSLOduration=2.009717688 podStartE2EDuration="1m5.076738397s" podCreationTimestamp="2026-03-19 10:25:57 +0000 UTC" firstStartedPulling="2026-03-19 10:25:58.367068562 +0000 UTC m=+256.716014104" lastFinishedPulling="2026-03-19 10:27:01.434089261 +0000 UTC m=+319.783034813" observedRunningTime="2026-03-19 10:27:02.075902694 +0000 UTC m=+320.424848236" watchObservedRunningTime="2026-03-19 10:27:02.076738397 +0000 UTC m=+320.425683939"
Mar 19 10:27:03 crc kubenswrapper[4765]: I0319 10:27:03.066003 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"77d960355abace30efa3218c2c2218608f2c437a4f4180a38603f11b6f6f7a6e"}
Mar 19 10:27:04 crc kubenswrapper[4765]: I0319 10:27:04.980759 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-rdfk6"
Mar 19 10:27:04 crc kubenswrapper[4765]: I0319 10:27:04.981351 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-rdfk6"
Mar 19 10:27:05 crc kubenswrapper[4765]: I0319 10:27:05.041545 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-rdfk6"
Mar 19 10:27:05 crc kubenswrapper[4765]: I0319 10:27:05.122846 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-rdfk6"
Mar 19 10:27:05 crc kubenswrapper[4765]: I0319 10:27:05.245575 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-84nts"
Mar 19 10:27:05 crc kubenswrapper[4765]: I0319 10:27:05.459940 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-bwctw"
Mar 19 10:27:05 crc kubenswrapper[4765]: I0319 10:27:05.505811 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-bwctw"
Mar 19 10:27:05 crc kubenswrapper[4765]: I0319 10:27:05.700765 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-gf6md"
Mar 19 10:27:05 crc kubenswrapper[4765]: I0319 10:27:05.741855 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-gf6md"
Mar 19 10:27:07 crc kubenswrapper[4765]: I0319 10:27:07.482454 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rvq7h"
Mar 19 10:27:07 crc kubenswrapper[4765]: I0319 10:27:07.485008 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rvq7h"
Mar 19 10:27:07 crc kubenswrapper[4765]: I0319 10:27:07.533107 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rvq7h"
Mar 19 10:27:07 crc kubenswrapper[4765]: I0319 10:27:07.594243 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gf6md"]
Mar 19 10:27:07 crc kubenswrapper[4765]: I0319 10:27:07.594521 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-gf6md" podUID="ae8ff072-c71b-412c-88a7-d834fbe98f10" containerName="registry-server" containerID="cri-o://fdfec617eac7f4631aceac191cb15adbe17b1398a1e272884235fd5cfbe64b99" gracePeriod=2
Mar 19 10:27:07 crc kubenswrapper[4765]: I0319 10:27:07.796187 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bwctw"]
Mar 19 10:27:07 crc kubenswrapper[4765]: I0319 10:27:07.796887 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-bwctw" podUID="926bf3fe-48b1-472a-93bd-da210e7ee945" containerName="registry-server" containerID="cri-o://e2ac22b6f45a2e206d80a6f7d5820f167cab6d95fcb470e2f503cc6999fdd88f" gracePeriod=2
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.097015 4765 generic.go:334] "Generic (PLEG): container finished" podID="926bf3fe-48b1-472a-93bd-da210e7ee945" containerID="e2ac22b6f45a2e206d80a6f7d5820f167cab6d95fcb470e2f503cc6999fdd88f" exitCode=0
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.097125 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwctw" event={"ID":"926bf3fe-48b1-472a-93bd-da210e7ee945","Type":"ContainerDied","Data":"e2ac22b6f45a2e206d80a6f7d5820f167cab6d95fcb470e2f503cc6999fdd88f"}
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.111618 4765 generic.go:334] "Generic (PLEG): container finished" podID="ae8ff072-c71b-412c-88a7-d834fbe98f10" containerID="fdfec617eac7f4631aceac191cb15adbe17b1398a1e272884235fd5cfbe64b99" exitCode=0
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.111986 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gf6md" event={"ID":"ae8ff072-c71b-412c-88a7-d834fbe98f10","Type":"ContainerDied","Data":"fdfec617eac7f4631aceac191cb15adbe17b1398a1e272884235fd5cfbe64b99"}
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.192255 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gf6md"
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.203260 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rvq7h"
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.229616 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-78gm5\" (UniqueName: \"kubernetes.io/projected/ae8ff072-c71b-412c-88a7-d834fbe98f10-kube-api-access-78gm5\") pod \"ae8ff072-c71b-412c-88a7-d834fbe98f10\" (UID: \"ae8ff072-c71b-412c-88a7-d834fbe98f10\") "
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.229699 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae8ff072-c71b-412c-88a7-d834fbe98f10-utilities\") pod \"ae8ff072-c71b-412c-88a7-d834fbe98f10\" (UID: \"ae8ff072-c71b-412c-88a7-d834fbe98f10\") "
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.229859 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae8ff072-c71b-412c-88a7-d834fbe98f10-catalog-content\") pod \"ae8ff072-c71b-412c-88a7-d834fbe98f10\" (UID: \"ae8ff072-c71b-412c-88a7-d834fbe98f10\") "
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.239854 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae8ff072-c71b-412c-88a7-d834fbe98f10-utilities" (OuterVolumeSpecName: "utilities") pod "ae8ff072-c71b-412c-88a7-d834fbe98f10" (UID: "ae8ff072-c71b-412c-88a7-d834fbe98f10"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.277154 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ae8ff072-c71b-412c-88a7-d834fbe98f10-kube-api-access-78gm5" (OuterVolumeSpecName: "kube-api-access-78gm5") pod "ae8ff072-c71b-412c-88a7-d834fbe98f10" (UID: "ae8ff072-c71b-412c-88a7-d834fbe98f10"). InnerVolumeSpecName "kube-api-access-78gm5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.293909 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9ddtf"
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.311752 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ae8ff072-c71b-412c-88a7-d834fbe98f10-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ae8ff072-c71b-412c-88a7-d834fbe98f10" (UID: "ae8ff072-c71b-412c-88a7-d834fbe98f10"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.328095 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwctw"
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.331900 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ae8ff072-c71b-412c-88a7-d834fbe98f10-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.331950 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-78gm5\" (UniqueName: \"kubernetes.io/projected/ae8ff072-c71b-412c-88a7-d834fbe98f10-kube-api-access-78gm5\") on node \"crc\" DevicePath \"\""
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.331988 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ae8ff072-c71b-412c-88a7-d834fbe98f10-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.389954 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9ddtf"
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.433732 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x45zq\" (UniqueName: \"kubernetes.io/projected/926bf3fe-48b1-472a-93bd-da210e7ee945-kube-api-access-x45zq\") pod \"926bf3fe-48b1-472a-93bd-da210e7ee945\" (UID: \"926bf3fe-48b1-472a-93bd-da210e7ee945\") "
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.433848 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/926bf3fe-48b1-472a-93bd-da210e7ee945-utilities\") pod \"926bf3fe-48b1-472a-93bd-da210e7ee945\" (UID: \"926bf3fe-48b1-472a-93bd-da210e7ee945\") "
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.433928 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/926bf3fe-48b1-472a-93bd-da210e7ee945-catalog-content\") pod \"926bf3fe-48b1-472a-93bd-da210e7ee945\" (UID: \"926bf3fe-48b1-472a-93bd-da210e7ee945\") "
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.437156 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/926bf3fe-48b1-472a-93bd-da210e7ee945-utilities" (OuterVolumeSpecName: "utilities") pod "926bf3fe-48b1-472a-93bd-da210e7ee945" (UID: "926bf3fe-48b1-472a-93bd-da210e7ee945"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.438101 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/926bf3fe-48b1-472a-93bd-da210e7ee945-kube-api-access-x45zq" (OuterVolumeSpecName: "kube-api-access-x45zq") pod "926bf3fe-48b1-472a-93bd-da210e7ee945" (UID: "926bf3fe-48b1-472a-93bd-da210e7ee945"). InnerVolumeSpecName "kube-api-access-x45zq". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.503214 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/926bf3fe-48b1-472a-93bd-da210e7ee945-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "926bf3fe-48b1-472a-93bd-da210e7ee945" (UID: "926bf3fe-48b1-472a-93bd-da210e7ee945"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.535739 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/926bf3fe-48b1-472a-93bd-da210e7ee945-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.535805 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/926bf3fe-48b1-472a-93bd-da210e7ee945-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.535819 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x45zq\" (UniqueName: \"kubernetes.io/projected/926bf3fe-48b1-472a-93bd-da210e7ee945-kube-api-access-x45zq\") on node \"crc\" DevicePath \"\""
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.644049 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-cbxll"
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.644461 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-cbxll"
Mar 19 10:27:08 crc kubenswrapper[4765]: I0319 10:27:08.698219 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-cbxll"
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.136715 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-gf6md"
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.136706 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-gf6md" event={"ID":"ae8ff072-c71b-412c-88a7-d834fbe98f10","Type":"ContainerDied","Data":"5449dfff0affcf2be11e4fd48c8a979503daaca962f185bfa21a47fa54d83287"}
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.136866 4765 scope.go:117] "RemoveContainer" containerID="fdfec617eac7f4631aceac191cb15adbe17b1398a1e272884235fd5cfbe64b99"
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.144491 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-bwctw" event={"ID":"926bf3fe-48b1-472a-93bd-da210e7ee945","Type":"ContainerDied","Data":"6326d376af9fd4a90a4483df1c303f9c15c334035fac5a6d1044a240c9870172"}
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.144872 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-bwctw"
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.173305 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-gf6md"]
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.179561 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-gf6md"]
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.179725 4765 scope.go:117] "RemoveContainer" containerID="59a3798a812e7e0637b96df97c052f5723e5dcd924dd41a900ae0b8b37708d1c"
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.184720 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-bwctw"]
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.188576 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-bwctw"]
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.199358 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-cbxll"
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.209065 4765 scope.go:117] "RemoveContainer" containerID="9b1a73a2603b1f7bbd4e0140dbb1d27960a836580520723d6bc82d12a054d229"
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.227835 4765 scope.go:117] "RemoveContainer" containerID="e2ac22b6f45a2e206d80a6f7d5820f167cab6d95fcb470e2f503cc6999fdd88f"
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.247186 4765 scope.go:117] "RemoveContainer" containerID="8ef7b97cc52b0f17c5419e5ba0e97f09dcab8b7952818254e42695758d08d84e"
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.270905 4765 scope.go:117] "RemoveContainer" containerID="8ce3548ec8213cf2ee2ffb8fa751b5a710f9a2b3b703e74fc32d49dd1b074fe3"
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.786409 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76cbf4c64-nzmkg"]
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.787249 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" podUID="3cd360e6-46f8-4b89-930a-b54685c18697" containerName="controller-manager" containerID="cri-o://0b7948baeddf7eaa2d37946679c86731e8aa060c70136556d406344d9461dd6e" gracePeriod=30
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.887358 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn"]
Mar 19 10:27:09 crc kubenswrapper[4765]: I0319 10:27:09.888125 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" podUID="a80be105-38e1-4ba5-86d3-8e91053b10d0" containerName="route-controller-manager" containerID="cri-o://4c3d6d377a3a041d6abb256f455536f4278e4427b337b93c0672c9c523060398" gracePeriod=30
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.001015 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvq7h"]
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.152407 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" event={"ID":"3cd360e6-46f8-4b89-930a-b54685c18697","Type":"ContainerDied","Data":"0b7948baeddf7eaa2d37946679c86731e8aa060c70136556d406344d9461dd6e"}
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.152369 4765 generic.go:334] "Generic (PLEG): container finished" podID="3cd360e6-46f8-4b89-930a-b54685c18697" containerID="0b7948baeddf7eaa2d37946679c86731e8aa060c70136556d406344d9461dd6e" exitCode=0
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.156636 4765 generic.go:334] "Generic (PLEG): container finished" podID="a80be105-38e1-4ba5-86d3-8e91053b10d0" containerID="4c3d6d377a3a041d6abb256f455536f4278e4427b337b93c0672c9c523060398" exitCode=0
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.156696 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" event={"ID":"a80be105-38e1-4ba5-86d3-8e91053b10d0","Type":"ContainerDied","Data":"4c3d6d377a3a041d6abb256f455536f4278e4427b337b93c0672c9c523060398"}
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.157924 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-rvq7h" podUID="4f3a6d2d-3226-4add-983c-2d3574217f12" containerName="registry-server" containerID="cri-o://be90c5a32e905643643864c329a5dd85bd3c736c6481c29b5c9877f50095cac7" gracePeriod=2
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.364919 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="926bf3fe-48b1-472a-93bd-da210e7ee945" path="/var/lib/kubelet/pods/926bf3fe-48b1-472a-93bd-da210e7ee945/volumes"
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.365813 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ae8ff072-c71b-412c-88a7-d834fbe98f10" path="/var/lib/kubelet/pods/ae8ff072-c71b-412c-88a7-d834fbe98f10/volumes"
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.380790 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn"
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.385216 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg"
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.475584 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cd360e6-46f8-4b89-930a-b54685c18697-serving-cert\") pod \"3cd360e6-46f8-4b89-930a-b54685c18697\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") "
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.475696 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a80be105-38e1-4ba5-86d3-8e91053b10d0-serving-cert\") pod \"a80be105-38e1-4ba5-86d3-8e91053b10d0\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") "
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.475739 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-client-ca\") pod \"3cd360e6-46f8-4b89-930a-b54685c18697\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") "
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.475780 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-855wx\" (UniqueName: \"kubernetes.io/projected/a80be105-38e1-4ba5-86d3-8e91053b10d0-kube-api-access-855wx\") pod \"a80be105-38e1-4ba5-86d3-8e91053b10d0\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") "
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.475845 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a80be105-38e1-4ba5-86d3-8e91053b10d0-config\") pod \"a80be105-38e1-4ba5-86d3-8e91053b10d0\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") "
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.475890 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mkp2l\" (UniqueName: \"kubernetes.io/projected/3cd360e6-46f8-4b89-930a-b54685c18697-kube-api-access-mkp2l\") pod \"3cd360e6-46f8-4b89-930a-b54685c18697\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") "
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.475934 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-config\") pod \"3cd360e6-46f8-4b89-930a-b54685c18697\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") "
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.475985 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a80be105-38e1-4ba5-86d3-8e91053b10d0-client-ca\") pod \"a80be105-38e1-4ba5-86d3-8e91053b10d0\" (UID: \"a80be105-38e1-4ba5-86d3-8e91053b10d0\") "
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.476016 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-proxy-ca-bundles\") pod \"3cd360e6-46f8-4b89-930a-b54685c18697\" (UID: \"3cd360e6-46f8-4b89-930a-b54685c18697\") "
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.477289 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "3cd360e6-46f8-4b89-930a-b54685c18697" (UID: "3cd360e6-46f8-4b89-930a-b54685c18697"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.478890 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a80be105-38e1-4ba5-86d3-8e91053b10d0-config" (OuterVolumeSpecName: "config") pod "a80be105-38e1-4ba5-86d3-8e91053b10d0" (UID: "a80be105-38e1-4ba5-86d3-8e91053b10d0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.479650 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-client-ca" (OuterVolumeSpecName: "client-ca") pod "3cd360e6-46f8-4b89-930a-b54685c18697" (UID: "3cd360e6-46f8-4b89-930a-b54685c18697"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.480143 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-config" (OuterVolumeSpecName: "config") pod "3cd360e6-46f8-4b89-930a-b54685c18697" (UID: "3cd360e6-46f8-4b89-930a-b54685c18697"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.482852 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a80be105-38e1-4ba5-86d3-8e91053b10d0-client-ca" (OuterVolumeSpecName: "client-ca") pod "a80be105-38e1-4ba5-86d3-8e91053b10d0" (UID: "a80be105-38e1-4ba5-86d3-8e91053b10d0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.483834 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a80be105-38e1-4ba5-86d3-8e91053b10d0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a80be105-38e1-4ba5-86d3-8e91053b10d0" (UID: "a80be105-38e1-4ba5-86d3-8e91053b10d0"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.483860 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cd360e6-46f8-4b89-930a-b54685c18697-kube-api-access-mkp2l" (OuterVolumeSpecName: "kube-api-access-mkp2l") pod "3cd360e6-46f8-4b89-930a-b54685c18697" (UID: "3cd360e6-46f8-4b89-930a-b54685c18697"). InnerVolumeSpecName "kube-api-access-mkp2l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.483985 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3cd360e6-46f8-4b89-930a-b54685c18697-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "3cd360e6-46f8-4b89-930a-b54685c18697" (UID: "3cd360e6-46f8-4b89-930a-b54685c18697"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.485215 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a80be105-38e1-4ba5-86d3-8e91053b10d0-kube-api-access-855wx" (OuterVolumeSpecName: "kube-api-access-855wx") pod "a80be105-38e1-4ba5-86d3-8e91053b10d0" (UID: "a80be105-38e1-4ba5-86d3-8e91053b10d0"). InnerVolumeSpecName "kube-api-access-855wx". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.507827 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvq7h" Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.577431 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nngkf\" (UniqueName: \"kubernetes.io/projected/4f3a6d2d-3226-4add-983c-2d3574217f12-kube-api-access-nngkf\") pod \"4f3a6d2d-3226-4add-983c-2d3574217f12\" (UID: \"4f3a6d2d-3226-4add-983c-2d3574217f12\") " Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.577562 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f3a6d2d-3226-4add-983c-2d3574217f12-catalog-content\") pod \"4f3a6d2d-3226-4add-983c-2d3574217f12\" (UID: \"4f3a6d2d-3226-4add-983c-2d3574217f12\") " Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.577644 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f3a6d2d-3226-4add-983c-2d3574217f12-utilities\") pod \"4f3a6d2d-3226-4add-983c-2d3574217f12\" (UID: \"4f3a6d2d-3226-4add-983c-2d3574217f12\") " Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.578061 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a80be105-38e1-4ba5-86d3-8e91053b10d0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.578105 4765 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.578130 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/3cd360e6-46f8-4b89-930a-b54685c18697-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.578152 
4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a80be105-38e1-4ba5-86d3-8e91053b10d0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.578170 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.578188 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-855wx\" (UniqueName: \"kubernetes.io/projected/a80be105-38e1-4ba5-86d3-8e91053b10d0-kube-api-access-855wx\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.578200 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a80be105-38e1-4ba5-86d3-8e91053b10d0-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.578213 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mkp2l\" (UniqueName: \"kubernetes.io/projected/3cd360e6-46f8-4b89-930a-b54685c18697-kube-api-access-mkp2l\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.578225 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/3cd360e6-46f8-4b89-930a-b54685c18697-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.580928 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f3a6d2d-3226-4add-983c-2d3574217f12-utilities" (OuterVolumeSpecName: "utilities") pod "4f3a6d2d-3226-4add-983c-2d3574217f12" (UID: "4f3a6d2d-3226-4add-983c-2d3574217f12"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.582341 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f3a6d2d-3226-4add-983c-2d3574217f12-kube-api-access-nngkf" (OuterVolumeSpecName: "kube-api-access-nngkf") pod "4f3a6d2d-3226-4add-983c-2d3574217f12" (UID: "4f3a6d2d-3226-4add-983c-2d3574217f12"). InnerVolumeSpecName "kube-api-access-nngkf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.605597 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4f3a6d2d-3226-4add-983c-2d3574217f12-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4f3a6d2d-3226-4add-983c-2d3574217f12" (UID: "4f3a6d2d-3226-4add-983c-2d3574217f12"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.680285 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nngkf\" (UniqueName: \"kubernetes.io/projected/4f3a6d2d-3226-4add-983c-2d3574217f12-kube-api-access-nngkf\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.680344 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4f3a6d2d-3226-4add-983c-2d3574217f12-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:10 crc kubenswrapper[4765]: I0319 10:27:10.680354 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4f3a6d2d-3226-4add-983c-2d3574217f12-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.165497 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" 
event={"ID":"a80be105-38e1-4ba5-86d3-8e91053b10d0","Type":"ContainerDied","Data":"57aa2240f5c2910c7527cf188f9d2a90afc70c792c57def6578b92c9202cd3d3"} Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.165567 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.165901 4765 scope.go:117] "RemoveContainer" containerID="4c3d6d377a3a041d6abb256f455536f4278e4427b337b93c0672c9c523060398" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.168979 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.169181 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-76cbf4c64-nzmkg" event={"ID":"3cd360e6-46f8-4b89-930a-b54685c18697","Type":"ContainerDied","Data":"5fe3863356b7401556fef8f558c875d7dc56dd1e6219392f51020412af4e5373"} Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.172207 4765 generic.go:334] "Generic (PLEG): container finished" podID="4f3a6d2d-3226-4add-983c-2d3574217f12" containerID="be90c5a32e905643643864c329a5dd85bd3c736c6481c29b5c9877f50095cac7" exitCode=0 Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.172273 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvq7h" event={"ID":"4f3a6d2d-3226-4add-983c-2d3574217f12","Type":"ContainerDied","Data":"be90c5a32e905643643864c329a5dd85bd3c736c6481c29b5c9877f50095cac7"} Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.172344 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rvq7h" 
event={"ID":"4f3a6d2d-3226-4add-983c-2d3574217f12","Type":"ContainerDied","Data":"72c2ab8993a0db5915bf4f79b30fea2fa1eaf10c4e732dfb7c43d7e3bd26b117"} Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.172910 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rvq7h" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.187924 4765 scope.go:117] "RemoveContainer" containerID="0b7948baeddf7eaa2d37946679c86731e8aa060c70136556d406344d9461dd6e" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.204479 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn"] Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.208178 4765 scope.go:117] "RemoveContainer" containerID="be90c5a32e905643643864c329a5dd85bd3c736c6481c29b5c9877f50095cac7" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.209503 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-56cffb6447-rwkgn"] Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.220451 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvq7h"] Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.223667 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-rvq7h"] Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.237853 4765 scope.go:117] "RemoveContainer" containerID="49807758b3a07252cb9b8dc9e95991c7fbf44f1231ce88f53219016d14c75f4f" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.238378 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-76cbf4c64-nzmkg"] Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.242225 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-controller-manager/controller-manager-76cbf4c64-nzmkg"] Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.261950 4765 scope.go:117] "RemoveContainer" containerID="44341ad7aad4eb7e57c344aeb125fb79eb2c09f0966756a4277647833ae0c13b" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.278617 4765 scope.go:117] "RemoveContainer" containerID="be90c5a32e905643643864c329a5dd85bd3c736c6481c29b5c9877f50095cac7" Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.279173 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be90c5a32e905643643864c329a5dd85bd3c736c6481c29b5c9877f50095cac7\": container with ID starting with be90c5a32e905643643864c329a5dd85bd3c736c6481c29b5c9877f50095cac7 not found: ID does not exist" containerID="be90c5a32e905643643864c329a5dd85bd3c736c6481c29b5c9877f50095cac7" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.279211 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be90c5a32e905643643864c329a5dd85bd3c736c6481c29b5c9877f50095cac7"} err="failed to get container status \"be90c5a32e905643643864c329a5dd85bd3c736c6481c29b5c9877f50095cac7\": rpc error: code = NotFound desc = could not find container \"be90c5a32e905643643864c329a5dd85bd3c736c6481c29b5c9877f50095cac7\": container with ID starting with be90c5a32e905643643864c329a5dd85bd3c736c6481c29b5c9877f50095cac7 not found: ID does not exist" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.279241 4765 scope.go:117] "RemoveContainer" containerID="49807758b3a07252cb9b8dc9e95991c7fbf44f1231ce88f53219016d14c75f4f" Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.283330 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49807758b3a07252cb9b8dc9e95991c7fbf44f1231ce88f53219016d14c75f4f\": container with ID starting with 
49807758b3a07252cb9b8dc9e95991c7fbf44f1231ce88f53219016d14c75f4f not found: ID does not exist" containerID="49807758b3a07252cb9b8dc9e95991c7fbf44f1231ce88f53219016d14c75f4f" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.283358 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49807758b3a07252cb9b8dc9e95991c7fbf44f1231ce88f53219016d14c75f4f"} err="failed to get container status \"49807758b3a07252cb9b8dc9e95991c7fbf44f1231ce88f53219016d14c75f4f\": rpc error: code = NotFound desc = could not find container \"49807758b3a07252cb9b8dc9e95991c7fbf44f1231ce88f53219016d14c75f4f\": container with ID starting with 49807758b3a07252cb9b8dc9e95991c7fbf44f1231ce88f53219016d14c75f4f not found: ID does not exist" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.283375 4765 scope.go:117] "RemoveContainer" containerID="44341ad7aad4eb7e57c344aeb125fb79eb2c09f0966756a4277647833ae0c13b" Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.283780 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44341ad7aad4eb7e57c344aeb125fb79eb2c09f0966756a4277647833ae0c13b\": container with ID starting with 44341ad7aad4eb7e57c344aeb125fb79eb2c09f0966756a4277647833ae0c13b not found: ID does not exist" containerID="44341ad7aad4eb7e57c344aeb125fb79eb2c09f0966756a4277647833ae0c13b" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.283800 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44341ad7aad4eb7e57c344aeb125fb79eb2c09f0966756a4277647833ae0c13b"} err="failed to get container status \"44341ad7aad4eb7e57c344aeb125fb79eb2c09f0966756a4277647833ae0c13b\": rpc error: code = NotFound desc = could not find container \"44341ad7aad4eb7e57c344aeb125fb79eb2c09f0966756a4277647833ae0c13b\": container with ID starting with 44341ad7aad4eb7e57c344aeb125fb79eb2c09f0966756a4277647833ae0c13b not found: ID does not 
exist" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.410984 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-56797d6658-v62bz"] Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.411816 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a80be105-38e1-4ba5-86d3-8e91053b10d0" containerName="route-controller-manager" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.411840 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a80be105-38e1-4ba5-86d3-8e91053b10d0" containerName="route-controller-manager" Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.411863 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3a6d2d-3226-4add-983c-2d3574217f12" containerName="registry-server" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.411870 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3a6d2d-3226-4add-983c-2d3574217f12" containerName="registry-server" Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.411888 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae8ff072-c71b-412c-88a7-d834fbe98f10" containerName="extract-utilities" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.411923 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae8ff072-c71b-412c-88a7-d834fbe98f10" containerName="extract-utilities" Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.411939 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae8ff072-c71b-412c-88a7-d834fbe98f10" containerName="extract-content" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.411946 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae8ff072-c71b-412c-88a7-d834fbe98f10" containerName="extract-content" Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.411974 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3a6d2d-3226-4add-983c-2d3574217f12" 
containerName="extract-utilities" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.411981 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3a6d2d-3226-4add-983c-2d3574217f12" containerName="extract-utilities" Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.411989 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="926bf3fe-48b1-472a-93bd-da210e7ee945" containerName="registry-server" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.411996 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="926bf3fe-48b1-472a-93bd-da210e7ee945" containerName="registry-server" Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.412005 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="926bf3fe-48b1-472a-93bd-da210e7ee945" containerName="extract-content" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.412012 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="926bf3fe-48b1-472a-93bd-da210e7ee945" containerName="extract-content" Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.412022 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="926bf3fe-48b1-472a-93bd-da210e7ee945" containerName="extract-utilities" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.412050 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="926bf3fe-48b1-472a-93bd-da210e7ee945" containerName="extract-utilities" Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.412061 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5495eef-efca-4df2-81bb-bd93bb2f8a38" containerName="oc" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.412069 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5495eef-efca-4df2-81bb-bd93bb2f8a38" containerName="oc" Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.412082 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cd360e6-46f8-4b89-930a-b54685c18697" 
containerName="controller-manager" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.412089 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cd360e6-46f8-4b89-930a-b54685c18697" containerName="controller-manager" Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.412117 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf72b802-ec4b-4a38-b575-d037677fe0dc" containerName="oc" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.412124 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf72b802-ec4b-4a38-b575-d037677fe0dc" containerName="oc" Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.412138 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ae8ff072-c71b-412c-88a7-d834fbe98f10" containerName="registry-server" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.412144 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ae8ff072-c71b-412c-88a7-d834fbe98f10" containerName="registry-server" Mar 19 10:27:11 crc kubenswrapper[4765]: E0319 10:27:11.412153 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f3a6d2d-3226-4add-983c-2d3574217f12" containerName="extract-content" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.412160 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f3a6d2d-3226-4add-983c-2d3574217f12" containerName="extract-content" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.412318 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="926bf3fe-48b1-472a-93bd-da210e7ee945" containerName="registry-server" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.412327 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cd360e6-46f8-4b89-930a-b54685c18697" containerName="controller-manager" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.412340 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="a80be105-38e1-4ba5-86d3-8e91053b10d0" 
containerName="route-controller-manager" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.412350 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf72b802-ec4b-4a38-b575-d037677fe0dc" containerName="oc" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.412362 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5495eef-efca-4df2-81bb-bd93bb2f8a38" containerName="oc" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.412369 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ae8ff072-c71b-412c-88a7-d834fbe98f10" containerName="registry-server" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.412380 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f3a6d2d-3226-4add-983c-2d3574217f12" containerName="registry-server" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.412990 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.415237 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8"] Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.416687 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.418680 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.418940 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.423013 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8"] Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.425541 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.425727 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.425864 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.425915 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.426057 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.426115 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.426131 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56797d6658-v62bz"] Mar 19 
10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.426202 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.426230 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.426341 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.426357 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.428051 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.493210 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsdjp\" (UniqueName: \"kubernetes.io/projected/a791f1a0-5735-45b7-b984-5299cca0c90c-kube-api-access-rsdjp\") pod \"controller-manager-56797d6658-v62bz\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.493276 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-proxy-ca-bundles\") pod \"controller-manager-56797d6658-v62bz\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.493316 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8e40efe-3ec1-479d-a2ca-53f44efc838e-client-ca\") pod \"route-controller-manager-5db9d4cbbf-wg8x8\" (UID: \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.493574 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-client-ca\") pod \"controller-manager-56797d6658-v62bz\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.493708 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cbrj2\" (UniqueName: \"kubernetes.io/projected/a8e40efe-3ec1-479d-a2ca-53f44efc838e-kube-api-access-cbrj2\") pod \"route-controller-manager-5db9d4cbbf-wg8x8\" (UID: \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.493880 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e40efe-3ec1-479d-a2ca-53f44efc838e-serving-cert\") pod \"route-controller-manager-5db9d4cbbf-wg8x8\" (UID: \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.494003 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e40efe-3ec1-479d-a2ca-53f44efc838e-config\") pod \"route-controller-manager-5db9d4cbbf-wg8x8\" (UID: 
\"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.494188 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-config\") pod \"controller-manager-56797d6658-v62bz\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.494306 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a791f1a0-5735-45b7-b984-5299cca0c90c-serving-cert\") pod \"controller-manager-56797d6658-v62bz\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.596025 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e40efe-3ec1-479d-a2ca-53f44efc838e-serving-cert\") pod \"route-controller-manager-5db9d4cbbf-wg8x8\" (UID: \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.596375 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e40efe-3ec1-479d-a2ca-53f44efc838e-config\") pod \"route-controller-manager-5db9d4cbbf-wg8x8\" (UID: \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.596503 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-config\") pod \"controller-manager-56797d6658-v62bz\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.596645 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a791f1a0-5735-45b7-b984-5299cca0c90c-serving-cert\") pod \"controller-manager-56797d6658-v62bz\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.596783 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsdjp\" (UniqueName: \"kubernetes.io/projected/a791f1a0-5735-45b7-b984-5299cca0c90c-kube-api-access-rsdjp\") pod \"controller-manager-56797d6658-v62bz\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.596895 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-proxy-ca-bundles\") pod \"controller-manager-56797d6658-v62bz\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.597018 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8e40efe-3ec1-479d-a2ca-53f44efc838e-client-ca\") pod \"route-controller-manager-5db9d4cbbf-wg8x8\" (UID: \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:11 crc kubenswrapper[4765]: 
I0319 10:27:11.597143 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-client-ca\") pod \"controller-manager-56797d6658-v62bz\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.597279 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cbrj2\" (UniqueName: \"kubernetes.io/projected/a8e40efe-3ec1-479d-a2ca-53f44efc838e-kube-api-access-cbrj2\") pod \"route-controller-manager-5db9d4cbbf-wg8x8\" (UID: \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.597896 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e40efe-3ec1-479d-a2ca-53f44efc838e-config\") pod \"route-controller-manager-5db9d4cbbf-wg8x8\" (UID: \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.598151 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-config\") pod \"controller-manager-56797d6658-v62bz\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.599275 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-proxy-ca-bundles\") pod \"controller-manager-56797d6658-v62bz\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " 
pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.599289 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8e40efe-3ec1-479d-a2ca-53f44efc838e-client-ca\") pod \"route-controller-manager-5db9d4cbbf-wg8x8\" (UID: \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.601070 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-client-ca\") pod \"controller-manager-56797d6658-v62bz\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.601454 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a791f1a0-5735-45b7-b984-5299cca0c90c-serving-cert\") pod \"controller-manager-56797d6658-v62bz\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.611531 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e40efe-3ec1-479d-a2ca-53f44efc838e-serving-cert\") pod \"route-controller-manager-5db9d4cbbf-wg8x8\" (UID: \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.617441 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsdjp\" (UniqueName: \"kubernetes.io/projected/a791f1a0-5735-45b7-b984-5299cca0c90c-kube-api-access-rsdjp\") pod 
\"controller-manager-56797d6658-v62bz\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.617480 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cbrj2\" (UniqueName: \"kubernetes.io/projected/a8e40efe-3ec1-479d-a2ca-53f44efc838e-kube-api-access-cbrj2\") pod \"route-controller-manager-5db9d4cbbf-wg8x8\" (UID: \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.754240 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:11 crc kubenswrapper[4765]: I0319 10:27:11.760738 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:12 crc kubenswrapper[4765]: I0319 10:27:12.053311 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-56797d6658-v62bz"] Mar 19 10:27:12 crc kubenswrapper[4765]: I0319 10:27:12.179694 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" event={"ID":"a791f1a0-5735-45b7-b984-5299cca0c90c","Type":"ContainerStarted","Data":"3150bd30b5d29a5563fc3f11592ed9b8237d114a8d5b12eb8ce287e72a22e9e4"} Mar 19 10:27:12 crc kubenswrapper[4765]: I0319 10:27:12.203071 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8"] Mar 19 10:27:12 crc kubenswrapper[4765]: W0319 10:27:12.216469 4765 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda8e40efe_3ec1_479d_a2ca_53f44efc838e.slice/crio-d806782ff03c981527b18466aa3d72cf17f9cfd409066871f13c2bc1e848b048 WatchSource:0}: Error finding container d806782ff03c981527b18466aa3d72cf17f9cfd409066871f13c2bc1e848b048: Status 404 returned error can't find the container with id d806782ff03c981527b18466aa3d72cf17f9cfd409066871f13c2bc1e848b048 Mar 19 10:27:12 crc kubenswrapper[4765]: I0319 10:27:12.367347 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cd360e6-46f8-4b89-930a-b54685c18697" path="/var/lib/kubelet/pods/3cd360e6-46f8-4b89-930a-b54685c18697/volumes" Mar 19 10:27:12 crc kubenswrapper[4765]: I0319 10:27:12.368152 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f3a6d2d-3226-4add-983c-2d3574217f12" path="/var/lib/kubelet/pods/4f3a6d2d-3226-4add-983c-2d3574217f12/volumes" Mar 19 10:27:12 crc kubenswrapper[4765]: I0319 10:27:12.368933 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a80be105-38e1-4ba5-86d3-8e91053b10d0" path="/var/lib/kubelet/pods/a80be105-38e1-4ba5-86d3-8e91053b10d0/volumes" Mar 19 10:27:12 crc kubenswrapper[4765]: I0319 10:27:12.420679 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cbxll"] Mar 19 10:27:12 crc kubenswrapper[4765]: I0319 10:27:12.421387 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-cbxll" podUID="021dd78e-84b2-410d-bb02-9919a7044f3e" containerName="registry-server" containerID="cri-o://bf66ec33f20818ffa880ff347a6ab435e4dfc2d802774b8d1332f8bf378b0ea4" gracePeriod=2 Mar 19 10:27:12 crc kubenswrapper[4765]: I0319 10:27:12.878268 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cbxll" Mar 19 10:27:12 crc kubenswrapper[4765]: I0319 10:27:12.929796 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/021dd78e-84b2-410d-bb02-9919a7044f3e-catalog-content\") pod \"021dd78e-84b2-410d-bb02-9919a7044f3e\" (UID: \"021dd78e-84b2-410d-bb02-9919a7044f3e\") " Mar 19 10:27:12 crc kubenswrapper[4765]: I0319 10:27:12.929872 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/021dd78e-84b2-410d-bb02-9919a7044f3e-utilities\") pod \"021dd78e-84b2-410d-bb02-9919a7044f3e\" (UID: \"021dd78e-84b2-410d-bb02-9919a7044f3e\") " Mar 19 10:27:12 crc kubenswrapper[4765]: I0319 10:27:12.929981 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9t2ln\" (UniqueName: \"kubernetes.io/projected/021dd78e-84b2-410d-bb02-9919a7044f3e-kube-api-access-9t2ln\") pod \"021dd78e-84b2-410d-bb02-9919a7044f3e\" (UID: \"021dd78e-84b2-410d-bb02-9919a7044f3e\") " Mar 19 10:27:12 crc kubenswrapper[4765]: I0319 10:27:12.930716 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/021dd78e-84b2-410d-bb02-9919a7044f3e-utilities" (OuterVolumeSpecName: "utilities") pod "021dd78e-84b2-410d-bb02-9919a7044f3e" (UID: "021dd78e-84b2-410d-bb02-9919a7044f3e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:27:12 crc kubenswrapper[4765]: I0319 10:27:12.939115 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/021dd78e-84b2-410d-bb02-9919a7044f3e-kube-api-access-9t2ln" (OuterVolumeSpecName: "kube-api-access-9t2ln") pod "021dd78e-84b2-410d-bb02-9919a7044f3e" (UID: "021dd78e-84b2-410d-bb02-9919a7044f3e"). InnerVolumeSpecName "kube-api-access-9t2ln". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.032369 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/021dd78e-84b2-410d-bb02-9919a7044f3e-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.032418 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9t2ln\" (UniqueName: \"kubernetes.io/projected/021dd78e-84b2-410d-bb02-9919a7044f3e-kube-api-access-9t2ln\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.103683 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/021dd78e-84b2-410d-bb02-9919a7044f3e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "021dd78e-84b2-410d-bb02-9919a7044f3e" (UID: "021dd78e-84b2-410d-bb02-9919a7044f3e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.134060 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/021dd78e-84b2-410d-bb02-9919a7044f3e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.209003 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" event={"ID":"a8e40efe-3ec1-479d-a2ca-53f44efc838e","Type":"ContainerStarted","Data":"1723156ef706dca453a08f5dd0237fd21b38381d71d874316bbd5fd9a4574ae5"} Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.209058 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" 
event={"ID":"a8e40efe-3ec1-479d-a2ca-53f44efc838e","Type":"ContainerStarted","Data":"d806782ff03c981527b18466aa3d72cf17f9cfd409066871f13c2bc1e848b048"} Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.210268 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.212229 4765 generic.go:334] "Generic (PLEG): container finished" podID="021dd78e-84b2-410d-bb02-9919a7044f3e" containerID="bf66ec33f20818ffa880ff347a6ab435e4dfc2d802774b8d1332f8bf378b0ea4" exitCode=0 Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.212289 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbxll" event={"ID":"021dd78e-84b2-410d-bb02-9919a7044f3e","Type":"ContainerDied","Data":"bf66ec33f20818ffa880ff347a6ab435e4dfc2d802774b8d1332f8bf378b0ea4"} Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.212309 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-cbxll" event={"ID":"021dd78e-84b2-410d-bb02-9919a7044f3e","Type":"ContainerDied","Data":"b37cf8c8451bb3701dded3fd3733ee40a7fa517c9a6832a2bc77e154985a4651"} Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.212330 4765 scope.go:117] "RemoveContainer" containerID="bf66ec33f20818ffa880ff347a6ab435e4dfc2d802774b8d1332f8bf378b0ea4" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.212457 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-cbxll" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.227615 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" event={"ID":"a791f1a0-5735-45b7-b984-5299cca0c90c","Type":"ContainerStarted","Data":"a87e1125030372fe0ba2f835a7217bf97ae19fcaeed4f5ec7f891b71d1e1b0aa"} Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.228129 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.243295 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" podStartSLOduration=4.243265581 podStartE2EDuration="4.243265581s" podCreationTimestamp="2026-03-19 10:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:27:13.241246447 +0000 UTC m=+331.590192009" watchObservedRunningTime="2026-03-19 10:27:13.243265581 +0000 UTC m=+331.592211123" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.245623 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.259402 4765 scope.go:117] "RemoveContainer" containerID="e501fb90109deadfe6e8e5ee213da9c472f27d98ef44b556eee908f3656cf297" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.265420 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" podStartSLOduration=4.265391898 podStartE2EDuration="4.265391898s" podCreationTimestamp="2026-03-19 10:27:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:27:13.26361418 +0000 UTC m=+331.612559722" watchObservedRunningTime="2026-03-19 10:27:13.265391898 +0000 UTC m=+331.614337440" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.277292 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.287859 4765 scope.go:117] "RemoveContainer" containerID="8838507e0552a36b0691fd5dd2535c83d431b08cc6c3018ca710dd4956cc7521" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.295898 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-cbxll"] Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.306462 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-cbxll"] Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.306929 4765 scope.go:117] "RemoveContainer" containerID="bf66ec33f20818ffa880ff347a6ab435e4dfc2d802774b8d1332f8bf378b0ea4" Mar 19 10:27:13 crc kubenswrapper[4765]: E0319 10:27:13.307946 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bf66ec33f20818ffa880ff347a6ab435e4dfc2d802774b8d1332f8bf378b0ea4\": container with ID starting with bf66ec33f20818ffa880ff347a6ab435e4dfc2d802774b8d1332f8bf378b0ea4 not found: ID does not exist" containerID="bf66ec33f20818ffa880ff347a6ab435e4dfc2d802774b8d1332f8bf378b0ea4" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.308021 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf66ec33f20818ffa880ff347a6ab435e4dfc2d802774b8d1332f8bf378b0ea4"} err="failed to get container status \"bf66ec33f20818ffa880ff347a6ab435e4dfc2d802774b8d1332f8bf378b0ea4\": rpc error: code = NotFound desc = could not find container 
\"bf66ec33f20818ffa880ff347a6ab435e4dfc2d802774b8d1332f8bf378b0ea4\": container with ID starting with bf66ec33f20818ffa880ff347a6ab435e4dfc2d802774b8d1332f8bf378b0ea4 not found: ID does not exist" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.308057 4765 scope.go:117] "RemoveContainer" containerID="e501fb90109deadfe6e8e5ee213da9c472f27d98ef44b556eee908f3656cf297" Mar 19 10:27:13 crc kubenswrapper[4765]: E0319 10:27:13.315199 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e501fb90109deadfe6e8e5ee213da9c472f27d98ef44b556eee908f3656cf297\": container with ID starting with e501fb90109deadfe6e8e5ee213da9c472f27d98ef44b556eee908f3656cf297 not found: ID does not exist" containerID="e501fb90109deadfe6e8e5ee213da9c472f27d98ef44b556eee908f3656cf297" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.315261 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e501fb90109deadfe6e8e5ee213da9c472f27d98ef44b556eee908f3656cf297"} err="failed to get container status \"e501fb90109deadfe6e8e5ee213da9c472f27d98ef44b556eee908f3656cf297\": rpc error: code = NotFound desc = could not find container \"e501fb90109deadfe6e8e5ee213da9c472f27d98ef44b556eee908f3656cf297\": container with ID starting with e501fb90109deadfe6e8e5ee213da9c472f27d98ef44b556eee908f3656cf297 not found: ID does not exist" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.315298 4765 scope.go:117] "RemoveContainer" containerID="8838507e0552a36b0691fd5dd2535c83d431b08cc6c3018ca710dd4956cc7521" Mar 19 10:27:13 crc kubenswrapper[4765]: E0319 10:27:13.319086 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8838507e0552a36b0691fd5dd2535c83d431b08cc6c3018ca710dd4956cc7521\": container with ID starting with 8838507e0552a36b0691fd5dd2535c83d431b08cc6c3018ca710dd4956cc7521 not found: ID does not exist" 
containerID="8838507e0552a36b0691fd5dd2535c83d431b08cc6c3018ca710dd4956cc7521" Mar 19 10:27:13 crc kubenswrapper[4765]: I0319 10:27:13.319150 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8838507e0552a36b0691fd5dd2535c83d431b08cc6c3018ca710dd4956cc7521"} err="failed to get container status \"8838507e0552a36b0691fd5dd2535c83d431b08cc6c3018ca710dd4956cc7521\": rpc error: code = NotFound desc = could not find container \"8838507e0552a36b0691fd5dd2535c83d431b08cc6c3018ca710dd4956cc7521\": container with ID starting with 8838507e0552a36b0691fd5dd2535c83d431b08cc6c3018ca710dd4956cc7521 not found: ID does not exist" Mar 19 10:27:14 crc kubenswrapper[4765]: I0319 10:27:14.365653 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="021dd78e-84b2-410d-bb02-9919a7044f3e" path="/var/lib/kubelet/pods/021dd78e-84b2-410d-bb02-9919a7044f3e/volumes" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.017577 4765 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 10:27:20 crc kubenswrapper[4765]: E0319 10:27:20.018617 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="021dd78e-84b2-410d-bb02-9919a7044f3e" containerName="extract-utilities" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.018636 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="021dd78e-84b2-410d-bb02-9919a7044f3e" containerName="extract-utilities" Mar 19 10:27:20 crc kubenswrapper[4765]: E0319 10:27:20.018654 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="021dd78e-84b2-410d-bb02-9919a7044f3e" containerName="extract-content" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.018662 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="021dd78e-84b2-410d-bb02-9919a7044f3e" containerName="extract-content" Mar 19 10:27:20 crc kubenswrapper[4765]: E0319 10:27:20.018676 4765 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="021dd78e-84b2-410d-bb02-9919a7044f3e" containerName="registry-server" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.018684 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="021dd78e-84b2-410d-bb02-9919a7044f3e" containerName="registry-server" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.018823 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="021dd78e-84b2-410d-bb02-9919a7044f3e" containerName="registry-server" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.019351 4765 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.019550 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.019709 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127" gracePeriod=15 Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.019763 4765 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.019654 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079" gracePeriod=15 Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.019774 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236" gracePeriod=15 Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.019711 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe" gracePeriod=15 Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.019729 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929" gracePeriod=15 Mar 19 10:27:20 crc kubenswrapper[4765]: E0319 10:27:20.020235 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020253 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 10:27:20 crc kubenswrapper[4765]: E0319 10:27:20.020264 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020271 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 19 10:27:20 crc kubenswrapper[4765]: E0319 10:27:20.020282 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 10:27:20 crc 
kubenswrapper[4765]: I0319 10:27:20.020289 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 10:27:20 crc kubenswrapper[4765]: E0319 10:27:20.020296 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020303 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 10:27:20 crc kubenswrapper[4765]: E0319 10:27:20.020312 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020329 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 10:27:20 crc kubenswrapper[4765]: E0319 10:27:20.020339 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020347 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 10:27:20 crc kubenswrapper[4765]: E0319 10:27:20.020360 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020367 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 10:27:20 crc kubenswrapper[4765]: E0319 10:27:20.020380 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="kube-apiserver-check-endpoints" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020388 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 10:27:20 crc kubenswrapper[4765]: E0319 10:27:20.020400 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020406 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020527 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020542 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020551 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020560 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020568 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020577 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 10:27:20 crc kubenswrapper[4765]: 
I0319 10:27:20.020585 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 19 10:27:20 crc kubenswrapper[4765]: E0319 10:27:20.020695 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020703 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020799 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.020806 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.024414 4765 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="f4b27818a5e8e43d0dc095d08835c792" podUID="71bb4a3aecc4ba5b26c4b7318770ce13" Mar 19 10:27:20 crc kubenswrapper[4765]: E0319 10:27:20.064467 4765 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.13:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.145692 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: 
\"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.146192 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.146223 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.146249 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.146275 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.146748 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod 
\"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.146802 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.146840 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.248906 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.249262 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.249048 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.249368 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.249644 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.249683 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.249822 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.249856 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.249854 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.249947 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.249894 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.250047 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.249921 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.249905 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.250044 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.250281 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.276594 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.278767 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.279594 4765 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127" exitCode=0 Mar 19 10:27:20 crc 
kubenswrapper[4765]: I0319 10:27:20.279641 4765 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929" exitCode=0 Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.279652 4765 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe" exitCode=0 Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.279661 4765 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236" exitCode=2 Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.279742 4765 scope.go:117] "RemoveContainer" containerID="678a92414c2ec4e0cc3df89429f4428d10e8e930c20bfc216ef035f0ccf9b2d1" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.282097 4765 generic.go:334] "Generic (PLEG): container finished" podID="68b99dac-eeb2-4875-948b-947030c71066" containerID="c01f5158548ae5bd98d9c3aaf891064f325017b2b31faba1770d5995bcbf8b3c" exitCode=0 Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.282138 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"68b99dac-eeb2-4875-948b-947030c71066","Type":"ContainerDied","Data":"c01f5158548ae5bd98d9c3aaf891064f325017b2b31faba1770d5995bcbf8b3c"} Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.283137 4765 status_manager.go:851] "Failed to get status for pod" podUID="68b99dac-eeb2-4875-948b-947030c71066" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:20 crc kubenswrapper[4765]: I0319 10:27:20.365844 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:20 crc kubenswrapper[4765]: W0319 10:27:20.395556 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-6262216ca3c0cbb4049b5c25180f01026a49284cd409f63408d9bbb0d30a4270 WatchSource:0}: Error finding container 6262216ca3c0cbb4049b5c25180f01026a49284cd409f63408d9bbb0d30a4270: Status 404 returned error can't find the container with id 6262216ca3c0cbb4049b5c25180f01026a49284cd409f63408d9bbb0d30a4270 Mar 19 10:27:20 crc kubenswrapper[4765]: E0319 10:27:20.399180 4765 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.13:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e373d79482679 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:27:20.397858425 +0000 UTC m=+338.746803967,LastTimestamp:2026-03-19 10:27:20.397858425 +0000 UTC m=+338.746803967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:27:21 crc kubenswrapper[4765]: E0319 10:27:21.018234 4765 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.13:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e373d79482679 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:27:20.397858425 +0000 UTC m=+338.746803967,LastTimestamp:2026-03-19 10:27:20.397858425 +0000 UTC m=+338.746803967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.290139 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"7e9a8538b929deedc2a68d00c4bcdc436377d08c6e7f154372ec51c1ac348b67"} Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.290220 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"6262216ca3c0cbb4049b5c25180f01026a49284cd409f63408d9bbb0d30a4270"} Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.291254 4765 status_manager.go:851] "Failed to get status for pod" podUID="68b99dac-eeb2-4875-948b-947030c71066" pod="openshift-kube-apiserver/installer-9-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:21 crc kubenswrapper[4765]: E0319 10:27:21.291522 4765 kubelet.go:1929] "Failed creating a mirror pod for" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods\": dial tcp 38.129.56.13:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.294335 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.477215 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" podUID="331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" containerName="oauth-openshift" containerID="cri-o://51decba02909f0333b6bfea08230359ef7374e3f74e4f96add5299dbef367c5a" gracePeriod=15 Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.645667 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.646376 4765 status_manager.go:851] "Failed to get status for pod" podUID="68b99dac-eeb2-4875-948b-947030c71066" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.677727 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/68b99dac-eeb2-4875-948b-947030c71066-var-lock\") pod \"68b99dac-eeb2-4875-948b-947030c71066\" (UID: \"68b99dac-eeb2-4875-948b-947030c71066\") " Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.677871 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68b99dac-eeb2-4875-948b-947030c71066-kube-api-access\") pod \"68b99dac-eeb2-4875-948b-947030c71066\" (UID: \"68b99dac-eeb2-4875-948b-947030c71066\") " Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.677972 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68b99dac-eeb2-4875-948b-947030c71066-kubelet-dir\") pod \"68b99dac-eeb2-4875-948b-947030c71066\" (UID: \"68b99dac-eeb2-4875-948b-947030c71066\") " Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.678185 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68b99dac-eeb2-4875-948b-947030c71066-var-lock" (OuterVolumeSpecName: "var-lock") pod "68b99dac-eeb2-4875-948b-947030c71066" (UID: "68b99dac-eeb2-4875-948b-947030c71066"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.678389 4765 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/68b99dac-eeb2-4875-948b-947030c71066-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.678447 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68b99dac-eeb2-4875-948b-947030c71066-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "68b99dac-eeb2-4875-948b-947030c71066" (UID: "68b99dac-eeb2-4875-948b-947030c71066"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.687435 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68b99dac-eeb2-4875-948b-947030c71066-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "68b99dac-eeb2-4875-948b-947030c71066" (UID: "68b99dac-eeb2-4875-948b-947030c71066"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.780416 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/68b99dac-eeb2-4875-948b-947030c71066-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.780488 4765 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/68b99dac-eeb2-4875-948b-947030c71066-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.966450 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.967566 4765 status_manager.go:851] "Failed to get status for pod" podUID="331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8sk6m\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:21 crc kubenswrapper[4765]: I0319 10:27:21.968012 4765 status_manager.go:851] "Failed to get status for pod" podUID="68b99dac-eeb2-4875-948b-947030c71066" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.084738 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-error\") pod \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.084931 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-provider-selection\") pod \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.085040 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z5fxc\" (UniqueName: \"kubernetes.io/projected/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-kube-api-access-z5fxc\") pod \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\" 
(UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.085094 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-session\") pod \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.085122 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-audit-dir\") pod \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.085157 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-serving-cert\") pod \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.085183 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-router-certs\") pod \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.085212 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-service-ca\") pod \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.085241 4765 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-login\") pod \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.085273 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-trusted-ca-bundle\") pod \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.085305 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-cliconfig\") pod \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.085353 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-idp-0-file-data\") pod \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.085400 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-ocp-branding-template\") pod \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.085435 4765 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-audit-policies\") pod \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\" (UID: \"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.086808 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" (UID: "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.087713 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" (UID: "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.090203 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" (UID: "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.090711 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" (UID: "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.090781 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" (UID: "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.090862 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" (UID: "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.098260 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" (UID: "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.101632 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" (UID: "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.103632 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" (UID: "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.105777 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" (UID: "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.117248 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" (UID: "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f"). 
InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.118838 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" (UID: "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.120035 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" (UID: "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.131373 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-kube-api-access-z5fxc" (OuterVolumeSpecName: "kube-api-access-z5fxc") pod "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" (UID: "331c5a49-dffb-4c14-ab1b-1b41bfd8f09f"). InnerVolumeSpecName "kube-api-access-z5fxc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.187148 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.187223 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.187237 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z5fxc\" (UniqueName: \"kubernetes.io/projected/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-kube-api-access-z5fxc\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.187253 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.187263 4765 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.187272 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.187283 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: 
\"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.187293 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.187303 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.187316 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.187327 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.187340 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.187352 4765 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 
10:27:22.187362 4765 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.302391 4765 generic.go:334] "Generic (PLEG): container finished" podID="331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" containerID="51decba02909f0333b6bfea08230359ef7374e3f74e4f96add5299dbef367c5a" exitCode=0 Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.302448 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.302469 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" event={"ID":"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f","Type":"ContainerDied","Data":"51decba02909f0333b6bfea08230359ef7374e3f74e4f96add5299dbef367c5a"} Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.302654 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" event={"ID":"331c5a49-dffb-4c14-ab1b-1b41bfd8f09f","Type":"ContainerDied","Data":"d8a08ad87358385224b7c93c14fb6898124d997385218d2feb739bce98dfe991"} Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.302674 4765 scope.go:117] "RemoveContainer" containerID="51decba02909f0333b6bfea08230359ef7374e3f74e4f96add5299dbef367c5a" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.303540 4765 status_manager.go:851] "Failed to get status for pod" podUID="331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8sk6m\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 
10:27:22.303860 4765 status_manager.go:851] "Failed to get status for pod" podUID="68b99dac-eeb2-4875-948b-947030c71066" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.305137 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"68b99dac-eeb2-4875-948b-947030c71066","Type":"ContainerDied","Data":"8fe4a48e7602307e58f4ab8200acbd7d3c25267792d8ea8536938b7a5c5c5cd4"} Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.305163 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8fe4a48e7602307e58f4ab8200acbd7d3c25267792d8ea8536938b7a5c5c5cd4" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.305203 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.359696 4765 status_manager.go:851] "Failed to get status for pod" podUID="331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8sk6m\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.360598 4765 status_manager.go:851] "Failed to get status for pod" podUID="68b99dac-eeb2-4875-948b-947030c71066" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.388840 4765 scope.go:117] "RemoveContainer" 
containerID="51decba02909f0333b6bfea08230359ef7374e3f74e4f96add5299dbef367c5a" Mar 19 10:27:22 crc kubenswrapper[4765]: E0319 10:27:22.389531 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51decba02909f0333b6bfea08230359ef7374e3f74e4f96add5299dbef367c5a\": container with ID starting with 51decba02909f0333b6bfea08230359ef7374e3f74e4f96add5299dbef367c5a not found: ID does not exist" containerID="51decba02909f0333b6bfea08230359ef7374e3f74e4f96add5299dbef367c5a" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.389580 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51decba02909f0333b6bfea08230359ef7374e3f74e4f96add5299dbef367c5a"} err="failed to get container status \"51decba02909f0333b6bfea08230359ef7374e3f74e4f96add5299dbef367c5a\": rpc error: code = NotFound desc = could not find container \"51decba02909f0333b6bfea08230359ef7374e3f74e4f96add5299dbef367c5a\": container with ID starting with 51decba02909f0333b6bfea08230359ef7374e3f74e4f96add5299dbef367c5a not found: ID does not exist" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.392529 4765 status_manager.go:851] "Failed to get status for pod" podUID="68b99dac-eeb2-4875-948b-947030c71066" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.392847 4765 status_manager.go:851] "Failed to get status for pod" podUID="331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8sk6m\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.392937 
4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.393911 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.394238 4765 status_manager.go:851] "Failed to get status for pod" podUID="331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8sk6m\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.394579 4765 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.395149 4765 status_manager.go:851] "Failed to get status for pod" podUID="68b99dac-eeb2-4875-948b-947030c71066" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.491051 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.491164 4765 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.491231 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.491252 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.491349 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.491454 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.491648 4765 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.491671 4765 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:22 crc kubenswrapper[4765]: I0319 10:27:22.491683 4765 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.316927 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.318344 4765 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079" exitCode=0 Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.318460 4765 scope.go:117] "RemoveContainer" containerID="0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.318550 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.333231 4765 status_manager.go:851] "Failed to get status for pod" podUID="331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8sk6m\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.333997 4765 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.335010 4765 status_manager.go:851] "Failed to get status for pod" podUID="68b99dac-eeb2-4875-948b-947030c71066" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.337564 4765 scope.go:117] "RemoveContainer" containerID="34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.356326 4765 scope.go:117] "RemoveContainer" containerID="429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.379570 4765 scope.go:117] "RemoveContainer" containerID="35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.396730 4765 scope.go:117] "RemoveContainer" containerID="412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079" Mar 19 10:27:23 crc 
kubenswrapper[4765]: I0319 10:27:23.413978 4765 scope.go:117] "RemoveContainer" containerID="47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.437724 4765 scope.go:117] "RemoveContainer" containerID="0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127" Mar 19 10:27:23 crc kubenswrapper[4765]: E0319 10:27:23.438515 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\": container with ID starting with 0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127 not found: ID does not exist" containerID="0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.438553 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127"} err="failed to get container status \"0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\": rpc error: code = NotFound desc = could not find container \"0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127\": container with ID starting with 0e02034a99a13e8fff837861c5d6cf08246a4d6faabd86c2d8c6d64f73a5c127 not found: ID does not exist" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.438615 4765 scope.go:117] "RemoveContainer" containerID="34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929" Mar 19 10:27:23 crc kubenswrapper[4765]: E0319 10:27:23.439315 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\": container with ID starting with 34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929 not found: ID does not exist" 
containerID="34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.439361 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929"} err="failed to get container status \"34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\": rpc error: code = NotFound desc = could not find container \"34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929\": container with ID starting with 34076a51c98d8d70872c268b3b0d75d6ea3ed768d0328becaf196e0bfd04a929 not found: ID does not exist" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.439399 4765 scope.go:117] "RemoveContainer" containerID="429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe" Mar 19 10:27:23 crc kubenswrapper[4765]: E0319 10:27:23.439893 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\": container with ID starting with 429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe not found: ID does not exist" containerID="429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.439934 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe"} err="failed to get container status \"429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\": rpc error: code = NotFound desc = could not find container \"429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe\": container with ID starting with 429e417d394699518a74739b912f740656f64f6e62c9cb54598b90b7199c0fbe not found: ID does not exist" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.439953 4765 scope.go:117] 
"RemoveContainer" containerID="35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236" Mar 19 10:27:23 crc kubenswrapper[4765]: E0319 10:27:23.440807 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\": container with ID starting with 35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236 not found: ID does not exist" containerID="35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.440837 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236"} err="failed to get container status \"35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\": rpc error: code = NotFound desc = could not find container \"35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236\": container with ID starting with 35bcafd447e0852275cc9753a43992e8d364c78b148409b5945a72b149a13236 not found: ID does not exist" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.440860 4765 scope.go:117] "RemoveContainer" containerID="412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079" Mar 19 10:27:23 crc kubenswrapper[4765]: E0319 10:27:23.441207 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\": container with ID starting with 412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079 not found: ID does not exist" containerID="412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.441229 4765 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079"} err="failed to get container status \"412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\": rpc error: code = NotFound desc = could not find container \"412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079\": container with ID starting with 412404f241df1deeaba6111774175594b6e41d6f7e355f16b71ebfb7d91b7079 not found: ID does not exist" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.441245 4765 scope.go:117] "RemoveContainer" containerID="47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c" Mar 19 10:27:23 crc kubenswrapper[4765]: E0319 10:27:23.441485 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\": container with ID starting with 47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c not found: ID does not exist" containerID="47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c" Mar 19 10:27:23 crc kubenswrapper[4765]: I0319 10:27:23.441569 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c"} err="failed to get container status \"47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\": rpc error: code = NotFound desc = could not find container \"47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c\": container with ID starting with 47c162f2d1d55306ee2f6e2565b6a0a0c584126a48505cd545041ec4a8ba059c not found: ID does not exist" Mar 19 10:27:24 crc kubenswrapper[4765]: I0319 10:27:24.364328 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 19 10:27:24 crc kubenswrapper[4765]: E0319 
10:27:24.390134 4765 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.129.56.13:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" volumeName="registry-storage" Mar 19 10:27:24 crc kubenswrapper[4765]: E0319 10:27:24.409523 4765 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:24 crc kubenswrapper[4765]: E0319 10:27:24.410367 4765 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:24 crc kubenswrapper[4765]: E0319 10:27:24.411037 4765 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:24 crc kubenswrapper[4765]: E0319 10:27:24.411661 4765 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:24 crc kubenswrapper[4765]: E0319 10:27:24.412096 4765 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:24 crc kubenswrapper[4765]: 
I0319 10:27:24.412136 4765 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 19 10:27:24 crc kubenswrapper[4765]: E0319 10:27:24.412531 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" interval="200ms" Mar 19 10:27:24 crc kubenswrapper[4765]: E0319 10:27:24.613893 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" interval="400ms" Mar 19 10:27:25 crc kubenswrapper[4765]: E0319 10:27:25.015230 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" interval="800ms" Mar 19 10:27:25 crc kubenswrapper[4765]: I0319 10:27:25.333806 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:27:25 crc kubenswrapper[4765]: W0319 10:27:25.334586 4765 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27269": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:27:25 crc 
kubenswrapper[4765]: E0319 10:27:25.334746 4765 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27269\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:27:25 crc kubenswrapper[4765]: I0319 10:27:25.435698 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:27:25 crc kubenswrapper[4765]: I0319 10:27:25.435786 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:27:25 crc kubenswrapper[4765]: I0319 10:27:25.435839 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:27:25 crc kubenswrapper[4765]: W0319 10:27:25.437606 4765 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get 
"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27269": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:27:25 crc kubenswrapper[4765]: E0319 10:27:25.437731 4765 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27269\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:27:25 crc kubenswrapper[4765]: W0319 10:27:25.437796 4765 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27257": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:27:25 crc kubenswrapper[4765]: E0319 10:27:25.437917 4765 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27257\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:27:25 crc kubenswrapper[4765]: E0319 10:27:25.816948 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" interval="1.6s" Mar 19 10:27:26 crc kubenswrapper[4765]: E0319 
10:27:26.335278 4765 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: failed to sync configmap cache: timed out waiting for the condition Mar 19 10:27:26 crc kubenswrapper[4765]: E0319 10:27:26.335413 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:29:28.33538093 +0000 UTC m=+466.684326482 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync configmap cache: timed out waiting for the condition Mar 19 10:27:26 crc kubenswrapper[4765]: E0319 10:27:26.436830 4765 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: failed to sync secret cache: timed out waiting for the condition Mar 19 10:27:26 crc kubenswrapper[4765]: E0319 10:27:26.436885 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 19 10:27:26 crc kubenswrapper[4765]: E0319 10:27:26.437006 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-19 10:29:28.436938629 +0000 UTC m=+466.785884211 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : failed to sync secret cache: timed out waiting for the condition Mar 19 10:27:26 crc kubenswrapper[4765]: E0319 10:27:26.437069 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 19 10:27:26 crc kubenswrapper[4765]: W0319 10:27:26.437486 4765 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27269": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:27:26 crc kubenswrapper[4765]: E0319 10:27:26.437607 4765 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27269\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:27:27 crc kubenswrapper[4765]: W0319 10:27:27.250218 4765 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27269": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:27:27 crc kubenswrapper[4765]: E0319 10:27:27.250315 4765 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27269\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:27:27 crc kubenswrapper[4765]: W0319 10:27:27.326146 4765 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27257": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:27:27 crc kubenswrapper[4765]: E0319 10:27:27.326262 4765 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27257\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:27:27 crc kubenswrapper[4765]: E0319 10:27:27.419311 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" interval="3.2s" Mar 19 10:27:27 crc kubenswrapper[4765]: E0319 10:27:27.438023 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 19 10:27:27 crc kubenswrapper[4765]: E0319 10:27:27.438057 4765 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr 
for pod openshift-network-diagnostics/network-check-target-xd92c: failed to sync configmap cache: timed out waiting for the condition Mar 19 10:27:27 crc kubenswrapper[4765]: E0319 10:27:27.438101 4765 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: failed to sync configmap cache: timed out waiting for the condition Mar 19 10:27:27 crc kubenswrapper[4765]: E0319 10:27:27.438178 4765 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: failed to sync configmap cache: timed out waiting for the condition Mar 19 10:27:27 crc kubenswrapper[4765]: E0319 10:27:27.438130 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-19 10:29:29.438107416 +0000 UTC m=+467.787052958 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : failed to sync configmap cache: timed out waiting for the condition Mar 19 10:27:27 crc kubenswrapper[4765]: E0319 10:27:27.438401 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-19 10:29:29.438329412 +0000 UTC m=+467.787274954 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : failed to sync configmap cache: timed out waiting for the condition Mar 19 10:27:27 crc kubenswrapper[4765]: W0319 10:27:27.847237 4765 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27269": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:27:27 crc kubenswrapper[4765]: E0319 10:27:27.847319 4765 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27269\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:27:29 crc kubenswrapper[4765]: W0319 10:27:29.161114 4765 reflector.go:561] object-"openshift-network-diagnostics"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27269": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:27:29 crc kubenswrapper[4765]: E0319 10:27:29.161798 4765 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dopenshift-service-ca.crt&resourceVersion=27269\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:27:30 crc kubenswrapper[4765]: E0319 10:27:30.620647 4765 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.129.56.13:6443: connect: connection refused" interval="6.4s" Mar 19 10:27:30 crc kubenswrapper[4765]: W0319 10:27:30.814921 4765 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin-cert": failed to list *v1.Secret: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27257": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:27:30 crc kubenswrapper[4765]: E0319 10:27:30.815098 4765 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin-cert\": Failed to watch *v1.Secret: failed to list *v1.Secret: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/secrets?fieldSelector=metadata.name%3Dnetworking-console-plugin-cert&resourceVersion=27257\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:27:31 crc kubenswrapper[4765]: E0319 10:27:31.020660 4765 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.129.56.13:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e373d79482679 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-19 10:27:20.397858425 +0000 UTC m=+338.746803967,LastTimestamp:2026-03-19 10:27:20.397858425 +0000 UTC m=+338.746803967,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 19 10:27:32 crc kubenswrapper[4765]: I0319 10:27:32.361166 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:32 crc kubenswrapper[4765]: I0319 10:27:32.361490 4765 status_manager.go:851] "Failed to get status for pod" podUID="331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8sk6m\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:32 crc kubenswrapper[4765]: I0319 10:27:32.362044 4765 status_manager.go:851] "Failed to get status for pod" podUID="68b99dac-eeb2-4875-948b-947030c71066" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:32 crc kubenswrapper[4765]: I0319 10:27:32.362747 4765 status_manager.go:851] "Failed to get status for pod" podUID="331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8sk6m\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:32 crc kubenswrapper[4765]: I0319 10:27:32.365405 4765 status_manager.go:851] "Failed to get status for pod" podUID="68b99dac-eeb2-4875-948b-947030c71066" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:32 crc kubenswrapper[4765]: I0319 10:27:32.390783 4765 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="006f7d04-2c90-47e9-983d-45318e2fc84e" Mar 19 10:27:32 crc kubenswrapper[4765]: I0319 10:27:32.390840 4765 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="006f7d04-2c90-47e9-983d-45318e2fc84e" Mar 19 10:27:32 crc kubenswrapper[4765]: E0319 10:27:32.391579 4765 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:32 crc kubenswrapper[4765]: I0319 10:27:32.392284 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:32 crc kubenswrapper[4765]: W0319 10:27:32.981874 4765 reflector.go:561] object-"openshift-network-diagnostics"/"kube-root-ca.crt": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27269": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:27:32 crc kubenswrapper[4765]: E0319 10:27:32.983109 4765 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-diagnostics/configmaps?fieldSelector=metadata.name%3Dkube-root-ca.crt&resourceVersion=27269\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:27:33 crc kubenswrapper[4765]: I0319 10:27:33.395950 4765 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="6ee670d404525606763f7b58eb9e5daa23f7a2bb05c827ed8c05377649cec3e6" exitCode=0 Mar 19 10:27:33 crc kubenswrapper[4765]: I0319 10:27:33.396021 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"6ee670d404525606763f7b58eb9e5daa23f7a2bb05c827ed8c05377649cec3e6"} Mar 19 10:27:33 crc kubenswrapper[4765]: I0319 10:27:33.396102 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"eb995f2b92e43f0f4f8fd186d12a436cca7a08eb9322a06258c37778bfcbcf97"} Mar 19 10:27:33 crc kubenswrapper[4765]: I0319 10:27:33.396427 4765 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
podUID="006f7d04-2c90-47e9-983d-45318e2fc84e" Mar 19 10:27:33 crc kubenswrapper[4765]: I0319 10:27:33.396446 4765 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="006f7d04-2c90-47e9-983d-45318e2fc84e" Mar 19 10:27:33 crc kubenswrapper[4765]: I0319 10:27:33.397203 4765 status_manager.go:851] "Failed to get status for pod" podUID="68b99dac-eeb2-4875-948b-947030c71066" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:33 crc kubenswrapper[4765]: E0319 10:27:33.397207 4765 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.129.56.13:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:33 crc kubenswrapper[4765]: I0319 10:27:33.397517 4765 status_manager.go:851] "Failed to get status for pod" podUID="331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" pod="openshift-authentication/oauth-openshift-558db77b4-8sk6m" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-authentication/pods/oauth-openshift-558db77b4-8sk6m\": dial tcp 38.129.56.13:6443: connect: connection refused" Mar 19 10:27:33 crc kubenswrapper[4765]: W0319 10:27:33.441893 4765 reflector.go:561] object-"openshift-network-console"/"networking-console-plugin": failed to list *v1.ConfigMap: Get "https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27269": dial tcp 38.129.56.13:6443: connect: connection refused Mar 19 10:27:33 crc kubenswrapper[4765]: E0319 10:27:33.442498 4765 reflector.go:158] "Unhandled Error" err="object-\"openshift-network-console\"/\"networking-console-plugin\": Failed to 
watch *v1.ConfigMap: failed to list *v1.ConfigMap: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-network-console/configmaps?fieldSelector=metadata.name%3Dnetworking-console-plugin&resourceVersion=27269\": dial tcp 38.129.56.13:6443: connect: connection refused" logger="UnhandledError" Mar 19 10:27:34 crc kubenswrapper[4765]: I0319 10:27:34.408422 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"21b57e64a825761753cc0564d285763e7a1ca727df1d1424c60c2491a4ddc326"} Mar 19 10:27:35 crc kubenswrapper[4765]: I0319 10:27:35.418696 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"7c98982ee9425ff73cfafad53cc39b80cc1788853be9e567108d6223ee29bc77"} Mar 19 10:27:35 crc kubenswrapper[4765]: I0319 10:27:35.419168 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"16ac6731d1bf265aa13069bebfb6952e8dbcea8325003e83e81333cb845cef85"} Mar 19 10:27:35 crc kubenswrapper[4765]: I0319 10:27:35.421294 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 19 10:27:35 crc kubenswrapper[4765]: I0319 10:27:35.426540 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 10:27:35 crc kubenswrapper[4765]: I0319 10:27:35.426613 4765 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" 
containerID="7e788c4b7d579a7df5887b915a34e162d4961a7767a1c54e6d9fcc3c2917d550" exitCode=1 Mar 19 10:27:35 crc kubenswrapper[4765]: I0319 10:27:35.426662 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"7e788c4b7d579a7df5887b915a34e162d4961a7767a1c54e6d9fcc3c2917d550"} Mar 19 10:27:35 crc kubenswrapper[4765]: I0319 10:27:35.427388 4765 scope.go:117] "RemoveContainer" containerID="7e788c4b7d579a7df5887b915a34e162d4961a7767a1c54e6d9fcc3c2917d550" Mar 19 10:27:35 crc kubenswrapper[4765]: I0319 10:27:35.981362 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:27:36 crc kubenswrapper[4765]: I0319 10:27:36.435850 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"009a2a1088e3dfc3c8809f954320a1988eb4ad3597a46798edb2790fd16e0e6e"} Mar 19 10:27:36 crc kubenswrapper[4765]: I0319 10:27:36.435908 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"ecf8c0f7143c9df92fc6f9d24704cf33bcf7e2653161a264bf962b9bab6afd55"} Mar 19 10:27:36 crc kubenswrapper[4765]: I0319 10:27:36.438647 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 19 10:27:36 crc kubenswrapper[4765]: I0319 10:27:36.439942 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 10:27:36 crc kubenswrapper[4765]: I0319 
10:27:36.440004 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"848552a75786f1c4d02cb61a33a18c5eb702c60f041033ef5aaff29131aef710"} Mar 19 10:27:37 crc kubenswrapper[4765]: I0319 10:27:37.446836 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:37 crc kubenswrapper[4765]: I0319 10:27:37.446851 4765 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="006f7d04-2c90-47e9-983d-45318e2fc84e" Mar 19 10:27:37 crc kubenswrapper[4765]: I0319 10:27:37.447481 4765 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="006f7d04-2c90-47e9-983d-45318e2fc84e" Mar 19 10:27:37 crc kubenswrapper[4765]: I0319 10:27:37.456157 4765 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:27:38 crc kubenswrapper[4765]: I0319 10:27:38.453704 4765 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="006f7d04-2c90-47e9-983d-45318e2fc84e" Mar 19 10:27:38 crc kubenswrapper[4765]: I0319 10:27:38.453745 4765 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="006f7d04-2c90-47e9-983d-45318e2fc84e" Mar 19 10:27:39 crc kubenswrapper[4765]: I0319 10:27:39.328000 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 19 10:27:40 crc kubenswrapper[4765]: I0319 10:27:40.441022 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 19 10:27:41 crc kubenswrapper[4765]: E0319 10:27:41.380772 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[kube-api-access-s2dwl], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 19 10:27:41 crc kubenswrapper[4765]: I0319 10:27:41.507994 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 19 10:27:42 crc kubenswrapper[4765]: I0319 10:27:42.373986 4765 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="36cdbc6a-e01d-4bb8-89eb-856869977e79" Mar 19 10:27:42 crc kubenswrapper[4765]: E0319 10:27:42.393783 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[kube-api-access-cqllr], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 19 10:27:42 crc kubenswrapper[4765]: E0319 10:27:42.397044 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert nginx-conf], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 19 10:27:42 crc kubenswrapper[4765]: I0319 10:27:42.591700 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:27:42 crc kubenswrapper[4765]: I0319 10:27:42.786049 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 19 10:27:45 crc kubenswrapper[4765]: I0319 10:27:45.981554 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:27:45 crc kubenswrapper[4765]: I0319 10:27:45.982294 4765 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 19 10:27:45 crc kubenswrapper[4765]: I0319 10:27:45.982393 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 19 10:27:50 crc kubenswrapper[4765]: I0319 10:27:50.832380 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 19 10:27:50 crc kubenswrapper[4765]: I0319 10:27:50.939395 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 19 10:27:51 crc kubenswrapper[4765]: I0319 10:27:51.129567 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 19 10:27:51 crc kubenswrapper[4765]: I0319 10:27:51.551433 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 19 10:27:51 crc kubenswrapper[4765]: I0319 10:27:51.983677 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 19 10:27:52 crc kubenswrapper[4765]: I0319 10:27:52.083791 4765 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 19 10:27:52 crc kubenswrapper[4765]: I0319 10:27:52.179342 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 10:27:52 crc kubenswrapper[4765]: I0319 10:27:52.338933 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 19 10:27:52 crc kubenswrapper[4765]: I0319 10:27:52.382541 4765 pod_container_manager_linux.go:210] "Failed to delete cgroup paths" cgroupName=["kubepods","pod68b99dac-eeb2-4875-948b-947030c71066"] err="unable to destroy cgroup paths for cgroup [kubepods pod68b99dac-eeb2-4875-948b-947030c71066] : Timed out while waiting for systemd to remove kubepods-pod68b99dac_eeb2_4875_948b_947030c71066.slice" Mar 19 10:27:52 crc kubenswrapper[4765]: E0319 10:27:52.382629 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to delete cgroup paths for [kubepods pod68b99dac-eeb2-4875-948b-947030c71066] : unable to destroy cgroup paths for cgroup [kubepods pod68b99dac-eeb2-4875-948b-947030c71066] : Timed out while waiting for systemd to remove kubepods-pod68b99dac_eeb2_4875_948b_947030c71066.slice" pod="openshift-kube-apiserver/installer-9-crc" podUID="68b99dac-eeb2-4875-948b-947030c71066" Mar 19 10:27:52 crc kubenswrapper[4765]: I0319 10:27:52.545601 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 19 10:27:52 crc kubenswrapper[4765]: I0319 10:27:52.994024 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 19 10:27:53 crc kubenswrapper[4765]: I0319 10:27:53.035771 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 19 10:27:53 crc kubenswrapper[4765]: I0319 10:27:53.295500 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 19 10:27:53 crc kubenswrapper[4765]: I0319 10:27:53.528342 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 19 10:27:53 crc kubenswrapper[4765]: I0319 10:27:53.545021 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 19 10:27:53 crc kubenswrapper[4765]: I0319 10:27:53.577358 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 19 10:27:53 crc kubenswrapper[4765]: I0319 10:27:53.670482 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 19 10:27:53 crc kubenswrapper[4765]: I0319 10:27:53.676163 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 19 10:27:53 crc kubenswrapper[4765]: I0319 10:27:53.696663 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 19 10:27:53 crc kubenswrapper[4765]: I0319 10:27:53.743315 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 19 10:27:54 crc kubenswrapper[4765]: I0319 10:27:54.056508 4765 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"env-overrides" Mar 19 10:27:54 crc kubenswrapper[4765]: I0319 10:27:54.056908 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 19 10:27:54 crc kubenswrapper[4765]: I0319 10:27:54.345611 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 19 10:27:54 crc kubenswrapper[4765]: I0319 10:27:54.355310 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:27:54 crc kubenswrapper[4765]: I0319 10:27:54.386591 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 19 10:27:54 crc kubenswrapper[4765]: I0319 10:27:54.409755 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 19 10:27:54 crc kubenswrapper[4765]: I0319 10:27:54.604451 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 19 10:27:54 crc kubenswrapper[4765]: I0319 10:27:54.640187 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 19 10:27:54 crc kubenswrapper[4765]: I0319 10:27:54.774823 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 19 10:27:54 crc kubenswrapper[4765]: I0319 10:27:54.778032 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 19 10:27:54 crc kubenswrapper[4765]: I0319 10:27:54.926136 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 19 10:27:54 crc kubenswrapper[4765]: I0319 
10:27:54.972822 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.045751 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.185634 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.207549 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.299607 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.355649 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.355809 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.379615 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.406664 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.409232 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.542022 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.560617 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.581403 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.593702 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.764581 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.789055 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 19 
10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.827562 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.886404 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.982266 4765 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.982370 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 19 10:27:55 crc kubenswrapper[4765]: I0319 10:27:55.993035 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.020433 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.143111 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.176855 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.307940 4765 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.308168 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.444731 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.476721 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.481821 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.520728 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.539559 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.589865 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.684280 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.701371 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.745295 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 
10:27:56.830230 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.958040 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 10:27:56 crc kubenswrapper[4765]: I0319 10:27:56.961867 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.072352 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.073325 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.179850 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.182326 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.188727 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.201544 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.217500 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.316682 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 19 10:27:57 crc 
kubenswrapper[4765]: I0319 10:27:57.347019 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.355178 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.428906 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.581928 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.589203 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.644757 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.662996 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.879785 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.909444 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 19 10:27:57 crc kubenswrapper[4765]: I0319 10:27:57.964121 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 10:27:58 crc kubenswrapper[4765]: I0319 10:27:58.066466 4765 reflector.go:368] Caches 
populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 19 10:27:58 crc kubenswrapper[4765]: I0319 10:27:58.097990 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 19 10:27:58 crc kubenswrapper[4765]: I0319 10:27:58.120709 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 19 10:27:58 crc kubenswrapper[4765]: I0319 10:27:58.302294 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 19 10:27:58 crc kubenswrapper[4765]: I0319 10:27:58.407565 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 19 10:27:58 crc kubenswrapper[4765]: I0319 10:27:58.432704 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 19 10:27:58 crc kubenswrapper[4765]: I0319 10:27:58.462197 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 19 10:27:58 crc kubenswrapper[4765]: I0319 10:27:58.586903 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 19 10:27:58 crc kubenswrapper[4765]: I0319 10:27:58.668610 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 19 10:27:58 crc kubenswrapper[4765]: I0319 10:27:58.739729 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 19 10:27:58 crc kubenswrapper[4765]: I0319 10:27:58.758589 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 19 10:27:58 crc kubenswrapper[4765]: I0319 
10:27:58.769947 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 19 10:27:58 crc kubenswrapper[4765]: I0319 10:27:58.813186 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 10:27:58 crc kubenswrapper[4765]: I0319 10:27:58.837031 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.016522 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.028205 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.046706 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.083807 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.262363 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.309861 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.387323 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.388804 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 19 10:27:59 crc 
kubenswrapper[4765]: I0319 10:27:59.399669 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.412728 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.440610 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.504590 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.612153 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.677707 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.705808 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.734524 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.858523 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.871098 4765 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 19 10:27:59 crc kubenswrapper[4765]: I0319 10:27:59.956386 4765 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.067673 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.067933 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.154093 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.240002 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.254134 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.364225 4765 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.413892 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.453734 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.454305 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.463661 4765 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.489679 
4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.580758 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.619125 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.641330 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.648753 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.655015 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.734282 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.773430 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.793060 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.843179 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.866188 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" 
Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.879525 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.894721 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 19 10:28:00 crc kubenswrapper[4765]: I0319 10:28:00.940526 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.006904 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.095975 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.128598 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.196325 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.218782 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.240850 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.380124 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.461687 4765 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.500341 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.526002 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.535928 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.538717 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.681493 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.688277 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.833221 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.837315 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.879461 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.958035 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 19 10:28:01 crc kubenswrapper[4765]: I0319 10:28:01.961849 4765 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.143691 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.287307 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.426447 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.433667 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.440630 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.485902 4765 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.490942 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-8sk6m"] Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.491030 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-6d89bff689-sd9zm","openshift-kube-apiserver/kube-apiserver-crc"] Mar 19 10:28:02 crc kubenswrapper[4765]: E0319 10:28:02.491251 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" containerName="oauth-openshift" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.491275 4765 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" containerName="oauth-openshift" Mar 19 10:28:02 crc kubenswrapper[4765]: E0319 10:28:02.491299 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68b99dac-eeb2-4875-948b-947030c71066" containerName="installer" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.491308 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="68b99dac-eeb2-4875-948b-947030c71066" containerName="installer" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.491414 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" containerName="oauth-openshift" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.491428 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="68b99dac-eeb2-4875-948b-947030c71066" containerName="installer" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.491911 4765 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="006f7d04-2c90-47e9-983d-45318e2fc84e" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.492029 4765 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="006f7d04-2c90-47e9-983d-45318e2fc84e" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.491990 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.494639 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.494953 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.497856 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.498178 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.498182 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.498404 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.498551 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.498584 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.498745 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.499214 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 19 10:28:02 crc kubenswrapper[4765]: 
I0319 10:28:02.499634 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.499814 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.500301 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.500926 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.510325 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.511586 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.518436 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=25.518413948 podStartE2EDuration="25.518413948s" podCreationTimestamp="2026-03-19 10:27:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:28:02.515171291 +0000 UTC m=+380.864116843" watchObservedRunningTime="2026-03-19 10:28:02.518413948 +0000 UTC m=+380.867359490" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.518943 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.576798 4765 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.599074 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.599369 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-service-ca\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.599513 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a5472c3-8592-48c1-982d-b28003141bc0-audit-policies\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.599617 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 
10:28:02.599744 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.599791 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-user-template-error\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.599824 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-session\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.599848 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-md547\" (UniqueName: \"kubernetes.io/projected/6a5472c3-8592-48c1-982d-b28003141bc0-kube-api-access-md547\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.599882 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: 
\"kubernetes.io/configmap/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.600009 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.600139 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-router-certs\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.600244 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-user-template-login\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.600402 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a5472c3-8592-48c1-982d-b28003141bc0-audit-dir\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: 
\"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.600524 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.685876 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.701917 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.702379 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-service-ca\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.702503 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a5472c3-8592-48c1-982d-b28003141bc0-audit-policies\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: 
\"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.702596 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.702746 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.702869 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-user-template-error\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.703003 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-session\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.703110 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-md547\" (UniqueName: \"kubernetes.io/projected/6a5472c3-8592-48c1-982d-b28003141bc0-kube-api-access-md547\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.703219 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.703299 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.703389 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-router-certs\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.703489 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-user-template-login\") pod 
\"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.703575 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a5472c3-8592-48c1-982d-b28003141bc0-audit-dir\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.703656 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.706330 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-service-ca\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.706655 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/6a5472c3-8592-48c1-982d-b28003141bc0-audit-policies\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.707183 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.707632 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-cliconfig\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.709139 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/6a5472c3-8592-48c1-982d-b28003141bc0-audit-dir\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.711130 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-serving-cert\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.714788 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-router-certs\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " 
pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.720803 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.721182 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-user-template-login\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.721547 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-session\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.721662 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.722686 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-user-template-error\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc 
kubenswrapper[4765]: I0319 10:28:02.724346 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.724779 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-md547\" (UniqueName: \"kubernetes.io/projected/6a5472c3-8592-48c1-982d-b28003141bc0-kube-api-access-md547\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.725149 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/6a5472c3-8592-48c1-982d-b28003141bc0-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-6d89bff689-sd9zm\" (UID: \"6a5472c3-8592-48c1-982d-b28003141bc0\") " pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.779139 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.813147 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.815412 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.847166 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.865829 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 19 10:28:02 crc kubenswrapper[4765]: I0319 10:28:02.986857 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.024662 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.040154 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.076421 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.250743 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.258156 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.287472 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.327467 4765 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.346253 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6d89bff689-sd9zm"] Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.417059 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.436016 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.477490 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.515663 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.579250 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.596506 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.646122 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.749437 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.779432 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.875469 4765 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-6d89bff689-sd9zm"] Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.886244 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 19 10:28:03 crc kubenswrapper[4765]: I0319 10:28:03.944949 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.040547 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.065833 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.078166 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.130918 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.166244 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.267029 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.328484 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.364466 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="331c5a49-dffb-4c14-ab1b-1b41bfd8f09f" path="/var/lib/kubelet/pods/331c5a49-dffb-4c14-ab1b-1b41bfd8f09f/volumes" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.373096 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.384860 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.611110 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" event={"ID":"6a5472c3-8592-48c1-982d-b28003141bc0","Type":"ContainerStarted","Data":"bffc347216c0fdbf50a5490e903020bebcbbc69bdc85b9cdd5d531a87b2c5ba6"} Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.611169 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" event={"ID":"6a5472c3-8592-48c1-982d-b28003141bc0","Type":"ContainerStarted","Data":"edc88c882251f9f87a0abe664c3e45e2eae30fa5448cdedb11e5401823abe07a"} Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.611392 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.618873 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.638768 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-6d89bff689-sd9zm" podStartSLOduration=68.638747685 podStartE2EDuration="1m8.638747685s" podCreationTimestamp="2026-03-19 10:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
10:28:04.636241588 +0000 UTC m=+382.985187140" watchObservedRunningTime="2026-03-19 10:28:04.638747685 +0000 UTC m=+382.987693237" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.640330 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.641060 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.652766 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.677216 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.691207 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.889833 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.912771 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.917290 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.922477 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.935157 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 19 10:28:04 
crc kubenswrapper[4765]: I0319 10:28:04.966364 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 19 10:28:04 crc kubenswrapper[4765]: I0319 10:28:04.993722 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.102945 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.103288 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.166190 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.343222 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.345487 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.401875 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.429336 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.521687 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.782774 4765 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.802001 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.932428 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.961143 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.982185 4765 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.982295 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.982377 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.983333 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"848552a75786f1c4d02cb61a33a18c5eb702c60f041033ef5aaff29131aef710"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 19 10:28:05 crc kubenswrapper[4765]: I0319 10:28:05.983648 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://848552a75786f1c4d02cb61a33a18c5eb702c60f041033ef5aaff29131aef710" gracePeriod=30 Mar 19 10:28:06 crc kubenswrapper[4765]: I0319 10:28:06.391145 4765 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 19 10:28:06 crc kubenswrapper[4765]: I0319 10:28:06.476733 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 19 10:28:06 crc kubenswrapper[4765]: I0319 10:28:06.540017 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 19 10:28:06 crc kubenswrapper[4765]: I0319 10:28:06.878801 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 19 10:28:07 crc kubenswrapper[4765]: I0319 10:28:07.028825 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 19 10:28:07 crc kubenswrapper[4765]: I0319 10:28:07.380105 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 10:28:07 crc kubenswrapper[4765]: I0319 10:28:07.392455 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:28:07 crc kubenswrapper[4765]: I0319 10:28:07.392527 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:28:07 crc kubenswrapper[4765]: 
I0319 10:28:07.397094 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:28:07 crc kubenswrapper[4765]: I0319 10:28:07.422266 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 19 10:28:07 crc kubenswrapper[4765]: I0319 10:28:07.644289 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 19 10:28:07 crc kubenswrapper[4765]: I0319 10:28:07.654109 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 19 10:28:07 crc kubenswrapper[4765]: I0319 10:28:07.957731 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 10:28:08 crc kubenswrapper[4765]: I0319 10:28:08.083376 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 19 10:28:08 crc kubenswrapper[4765]: I0319 10:28:08.095857 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 19 10:28:13 crc kubenswrapper[4765]: I0319 10:28:13.203495 4765 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 19 10:28:13 crc kubenswrapper[4765]: I0319 10:28:13.204283 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://7e9a8538b929deedc2a68d00c4bcdc436377d08c6e7f154372ec51c1ac348b67" gracePeriod=5 Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.370413 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.370531 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.549992 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.550438 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.550175 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.550540 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.550639 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.550719 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.550761 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.550844 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.550946 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.551350 4765 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.551773 4765 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.552027 4765 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.552090 4765 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.563873 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.653367 4765 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.703481 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.703531 4765 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="7e9a8538b929deedc2a68d00c4bcdc436377d08c6e7f154372ec51c1ac348b67" exitCode=137 Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.703580 4765 scope.go:117] "RemoveContainer" containerID="7e9a8538b929deedc2a68d00c4bcdc436377d08c6e7f154372ec51c1ac348b67" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.703714 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.724542 4765 scope.go:117] "RemoveContainer" containerID="7e9a8538b929deedc2a68d00c4bcdc436377d08c6e7f154372ec51c1ac348b67" Mar 19 10:28:18 crc kubenswrapper[4765]: E0319 10:28:18.725618 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e9a8538b929deedc2a68d00c4bcdc436377d08c6e7f154372ec51c1ac348b67\": container with ID starting with 7e9a8538b929deedc2a68d00c4bcdc436377d08c6e7f154372ec51c1ac348b67 not found: ID does not exist" containerID="7e9a8538b929deedc2a68d00c4bcdc436377d08c6e7f154372ec51c1ac348b67" Mar 19 10:28:18 crc kubenswrapper[4765]: I0319 10:28:18.725656 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e9a8538b929deedc2a68d00c4bcdc436377d08c6e7f154372ec51c1ac348b67"} err="failed to get container status \"7e9a8538b929deedc2a68d00c4bcdc436377d08c6e7f154372ec51c1ac348b67\": rpc error: code = NotFound desc = could not find container \"7e9a8538b929deedc2a68d00c4bcdc436377d08c6e7f154372ec51c1ac348b67\": container with ID starting with 7e9a8538b929deedc2a68d00c4bcdc436377d08c6e7f154372ec51c1ac348b67 not found: ID does not exist" Mar 19 10:28:20 crc kubenswrapper[4765]: I0319 10:28:20.363550 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 19 10:28:23 crc kubenswrapper[4765]: I0319 10:28:23.736751 4765 generic.go:334] "Generic (PLEG): container finished" podID="f27f5c72-19c7-4d66-b927-0eae532ff4fe" containerID="d5157a82387bb5080cd401571c48cc1a277e4ccbfe656a5a755dd8c507080b2a" exitCode=0 Mar 19 10:28:23 crc kubenswrapper[4765]: I0319 10:28:23.736833 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" event={"ID":"f27f5c72-19c7-4d66-b927-0eae532ff4fe","Type":"ContainerDied","Data":"d5157a82387bb5080cd401571c48cc1a277e4ccbfe656a5a755dd8c507080b2a"} Mar 19 10:28:23 crc kubenswrapper[4765]: I0319 10:28:23.737878 4765 scope.go:117] "RemoveContainer" containerID="d5157a82387bb5080cd401571c48cc1a277e4ccbfe656a5a755dd8c507080b2a" Mar 19 10:28:24 crc kubenswrapper[4765]: I0319 10:28:24.745635 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" event={"ID":"f27f5c72-19c7-4d66-b927-0eae532ff4fe","Type":"ContainerStarted","Data":"a33e13ece53709b3a00decabb15c5aed0fd61ea1b0da2fee91c5128199dc861c"} Mar 19 10:28:24 crc kubenswrapper[4765]: I0319 10:28:24.747207 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:28:24 crc kubenswrapper[4765]: I0319 10:28:24.750784 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:28:36 crc kubenswrapper[4765]: I0319 10:28:36.820715 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 19 10:28:36 crc kubenswrapper[4765]: I0319 10:28:36.822851 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 19 10:28:36 crc kubenswrapper[4765]: I0319 10:28:36.824063 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 19 10:28:36 crc kubenswrapper[4765]: I0319 10:28:36.824140 4765 generic.go:334] "Generic (PLEG): container 
finished" podID="f614b9022728cf315e60c057852e563e" containerID="848552a75786f1c4d02cb61a33a18c5eb702c60f041033ef5aaff29131aef710" exitCode=137 Mar 19 10:28:36 crc kubenswrapper[4765]: I0319 10:28:36.824187 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"848552a75786f1c4d02cb61a33a18c5eb702c60f041033ef5aaff29131aef710"} Mar 19 10:28:36 crc kubenswrapper[4765]: I0319 10:28:36.824238 4765 scope.go:117] "RemoveContainer" containerID="7e788c4b7d579a7df5887b915a34e162d4961a7767a1c54e6d9fcc3c2917d550" Mar 19 10:28:37 crc kubenswrapper[4765]: I0319 10:28:37.832938 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 19 10:28:37 crc kubenswrapper[4765]: I0319 10:28:37.834240 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/1.log" Mar 19 10:28:37 crc kubenswrapper[4765]: I0319 10:28:37.835090 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ed889487637112e2fbd6030b7b8f5a29cf68177fb9fe3e5c27137f3e0415f716"} Mar 19 10:28:42 crc kubenswrapper[4765]: I0319 10:28:42.591619 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:28:45 crc kubenswrapper[4765]: I0319 10:28:45.981790 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:28:45 crc kubenswrapper[4765]: I0319 10:28:45.992624 4765 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:28:46 crc kubenswrapper[4765]: I0319 10:28:46.887622 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.720795 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56797d6658-v62bz"] Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.721756 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" podUID="a791f1a0-5735-45b7-b984-5299cca0c90c" containerName="controller-manager" containerID="cri-o://a87e1125030372fe0ba2f835a7217bf97ae19fcaeed4f5ec7f891b71d1e1b0aa" gracePeriod=30 Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.727225 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8"] Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.727458 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" podUID="a8e40efe-3ec1-479d-a2ca-53f44efc838e" containerName="route-controller-manager" containerID="cri-o://1723156ef706dca453a08f5dd0237fd21b38381d71d874316bbd5fd9a4574ae5" gracePeriod=30 Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.748863 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565268-t2x8x"] Mar 19 10:28:53 crc kubenswrapper[4765]: E0319 10:28:53.749218 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.749236 4765 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.749390 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.749927 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565268-t2x8x" Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.752317 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.753334 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.756064 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.771495 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565268-t2x8x"] Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.890839 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27w7r\" (UniqueName: \"kubernetes.io/projected/6237c3c3-e25e-4b5d-8b7a-66198a313195-kube-api-access-27w7r\") pod \"auto-csr-approver-29565268-t2x8x\" (UID: \"6237c3c3-e25e-4b5d-8b7a-66198a313195\") " pod="openshift-infra/auto-csr-approver-29565268-t2x8x" Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.943071 4765 generic.go:334] "Generic (PLEG): container finished" podID="a791f1a0-5735-45b7-b984-5299cca0c90c" containerID="a87e1125030372fe0ba2f835a7217bf97ae19fcaeed4f5ec7f891b71d1e1b0aa" exitCode=0 Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.943194 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" event={"ID":"a791f1a0-5735-45b7-b984-5299cca0c90c","Type":"ContainerDied","Data":"a87e1125030372fe0ba2f835a7217bf97ae19fcaeed4f5ec7f891b71d1e1b0aa"} Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.952786 4765 generic.go:334] "Generic (PLEG): container finished" podID="a8e40efe-3ec1-479d-a2ca-53f44efc838e" containerID="1723156ef706dca453a08f5dd0237fd21b38381d71d874316bbd5fd9a4574ae5" exitCode=0 Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.952848 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" event={"ID":"a8e40efe-3ec1-479d-a2ca-53f44efc838e","Type":"ContainerDied","Data":"1723156ef706dca453a08f5dd0237fd21b38381d71d874316bbd5fd9a4574ae5"} Mar 19 10:28:53 crc kubenswrapper[4765]: I0319 10:28:53.992786 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27w7r\" (UniqueName: \"kubernetes.io/projected/6237c3c3-e25e-4b5d-8b7a-66198a313195-kube-api-access-27w7r\") pod \"auto-csr-approver-29565268-t2x8x\" (UID: \"6237c3c3-e25e-4b5d-8b7a-66198a313195\") " pod="openshift-infra/auto-csr-approver-29565268-t2x8x" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.019149 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27w7r\" (UniqueName: \"kubernetes.io/projected/6237c3c3-e25e-4b5d-8b7a-66198a313195-kube-api-access-27w7r\") pod \"auto-csr-approver-29565268-t2x8x\" (UID: \"6237c3c3-e25e-4b5d-8b7a-66198a313195\") " pod="openshift-infra/auto-csr-approver-29565268-t2x8x" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.070137 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565268-t2x8x" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.223513 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.321515 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.398322 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a8e40efe-3ec1-479d-a2ca-53f44efc838e-client-ca\") pod \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\" (UID: \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.398444 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e40efe-3ec1-479d-a2ca-53f44efc838e-config\") pod \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\" (UID: \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.398494 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e40efe-3ec1-479d-a2ca-53f44efc838e-serving-cert\") pod \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\" (UID: \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.398557 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cbrj2\" (UniqueName: \"kubernetes.io/projected/a8e40efe-3ec1-479d-a2ca-53f44efc838e-kube-api-access-cbrj2\") pod \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\" (UID: \"a8e40efe-3ec1-479d-a2ca-53f44efc838e\") " Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.399579 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e40efe-3ec1-479d-a2ca-53f44efc838e-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"a8e40efe-3ec1-479d-a2ca-53f44efc838e" (UID: "a8e40efe-3ec1-479d-a2ca-53f44efc838e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.399888 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a8e40efe-3ec1-479d-a2ca-53f44efc838e-config" (OuterVolumeSpecName: "config") pod "a8e40efe-3ec1-479d-a2ca-53f44efc838e" (UID: "a8e40efe-3ec1-479d-a2ca-53f44efc838e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.405100 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8e40efe-3ec1-479d-a2ca-53f44efc838e-kube-api-access-cbrj2" (OuterVolumeSpecName: "kube-api-access-cbrj2") pod "a8e40efe-3ec1-479d-a2ca-53f44efc838e" (UID: "a8e40efe-3ec1-479d-a2ca-53f44efc838e"). InnerVolumeSpecName "kube-api-access-cbrj2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.405684 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8e40efe-3ec1-479d-a2ca-53f44efc838e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a8e40efe-3ec1-479d-a2ca-53f44efc838e" (UID: "a8e40efe-3ec1-479d-a2ca-53f44efc838e"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.499897 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-proxy-ca-bundles\") pod \"a791f1a0-5735-45b7-b984-5299cca0c90c\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.500107 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsdjp\" (UniqueName: \"kubernetes.io/projected/a791f1a0-5735-45b7-b984-5299cca0c90c-kube-api-access-rsdjp\") pod \"a791f1a0-5735-45b7-b984-5299cca0c90c\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.500145 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-config\") pod \"a791f1a0-5735-45b7-b984-5299cca0c90c\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.500204 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-client-ca\") pod \"a791f1a0-5735-45b7-b984-5299cca0c90c\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.500245 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a791f1a0-5735-45b7-b984-5299cca0c90c-serving-cert\") pod \"a791f1a0-5735-45b7-b984-5299cca0c90c\" (UID: \"a791f1a0-5735-45b7-b984-5299cca0c90c\") " Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.500600 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/a8e40efe-3ec1-479d-a2ca-53f44efc838e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.500645 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a8e40efe-3ec1-479d-a2ca-53f44efc838e-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.500661 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a8e40efe-3ec1-479d-a2ca-53f44efc838e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.500673 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cbrj2\" (UniqueName: \"kubernetes.io/projected/a8e40efe-3ec1-479d-a2ca-53f44efc838e-kube-api-access-cbrj2\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.501239 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-client-ca" (OuterVolumeSpecName: "client-ca") pod "a791f1a0-5735-45b7-b984-5299cca0c90c" (UID: "a791f1a0-5735-45b7-b984-5299cca0c90c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.501700 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-config" (OuterVolumeSpecName: "config") pod "a791f1a0-5735-45b7-b984-5299cca0c90c" (UID: "a791f1a0-5735-45b7-b984-5299cca0c90c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.502133 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "a791f1a0-5735-45b7-b984-5299cca0c90c" (UID: "a791f1a0-5735-45b7-b984-5299cca0c90c"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.504707 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a791f1a0-5735-45b7-b984-5299cca0c90c-kube-api-access-rsdjp" (OuterVolumeSpecName: "kube-api-access-rsdjp") pod "a791f1a0-5735-45b7-b984-5299cca0c90c" (UID: "a791f1a0-5735-45b7-b984-5299cca0c90c"). InnerVolumeSpecName "kube-api-access-rsdjp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.505324 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a791f1a0-5735-45b7-b984-5299cca0c90c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "a791f1a0-5735-45b7-b984-5299cca0c90c" (UID: "a791f1a0-5735-45b7-b984-5299cca0c90c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.602168 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsdjp\" (UniqueName: \"kubernetes.io/projected/a791f1a0-5735-45b7-b984-5299cca0c90c-kube-api-access-rsdjp\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.602222 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.602235 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.602249 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a791f1a0-5735-45b7-b984-5299cca0c90c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.602261 4765 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/a791f1a0-5735-45b7-b984-5299cca0c90c-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.609397 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565268-t2x8x"] Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.962826 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" event={"ID":"a791f1a0-5735-45b7-b984-5299cca0c90c","Type":"ContainerDied","Data":"3150bd30b5d29a5563fc3f11592ed9b8237d114a8d5b12eb8ce287e72a22e9e4"} Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.962878 4765 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-56797d6658-v62bz" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.962899 4765 scope.go:117] "RemoveContainer" containerID="a87e1125030372fe0ba2f835a7217bf97ae19fcaeed4f5ec7f891b71d1e1b0aa" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.965246 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.965234 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8" event={"ID":"a8e40efe-3ec1-479d-a2ca-53f44efc838e","Type":"ContainerDied","Data":"d806782ff03c981527b18466aa3d72cf17f9cfd409066871f13c2bc1e848b048"} Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.972202 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565268-t2x8x" event={"ID":"6237c3c3-e25e-4b5d-8b7a-66198a313195","Type":"ContainerStarted","Data":"c957c7535a30a7d47d6811720d70ecae3c7a5c0987f8eb9b8bd520ece00378e1"} Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.982561 4765 scope.go:117] "RemoveContainer" containerID="1723156ef706dca453a08f5dd0237fd21b38381d71d874316bbd5fd9a4574ae5" Mar 19 10:28:54 crc kubenswrapper[4765]: I0319 10:28:54.997813 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8"] Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.000733 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-5db9d4cbbf-wg8x8"] Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.009994 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-56797d6658-v62bz"] Mar 19 10:28:55 crc 
kubenswrapper[4765]: I0319 10:28:55.013622 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-56797d6658-v62bz"] Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.116892 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl"] Mar 19 10:28:55 crc kubenswrapper[4765]: E0319 10:28:55.117194 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8e40efe-3ec1-479d-a2ca-53f44efc838e" containerName="route-controller-manager" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.117314 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8e40efe-3ec1-479d-a2ca-53f44efc838e" containerName="route-controller-manager" Mar 19 10:28:55 crc kubenswrapper[4765]: E0319 10:28:55.117345 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a791f1a0-5735-45b7-b984-5299cca0c90c" containerName="controller-manager" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.117353 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a791f1a0-5735-45b7-b984-5299cca0c90c" containerName="controller-manager" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.117502 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8e40efe-3ec1-479d-a2ca-53f44efc838e" containerName="route-controller-manager" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.117525 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="a791f1a0-5735-45b7-b984-5299cca0c90c" containerName="controller-manager" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.118064 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.119934 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.120323 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.120729 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.120774 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.122150 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6c54555755-wx24k"] Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.122424 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.123167 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.127036 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.127112 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.127549 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.127831 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.128024 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.128513 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.130887 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c54555755-wx24k"] Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.131019 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.133671 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.136380 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl"] Mar 19 10:28:55 crc 
kubenswrapper[4765]: I0319 10:28:55.216145 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-config\") pod \"controller-manager-6c54555755-wx24k\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.216272 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-htvsl\" (UniqueName: \"kubernetes.io/projected/ab25744c-43c6-4303-9dd4-048b634054b6-kube-api-access-htvsl\") pod \"route-controller-manager-6f9c758cfc-d5rzl\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.216317 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab25744c-43c6-4303-9dd4-048b634054b6-serving-cert\") pod \"route-controller-manager-6f9c758cfc-d5rzl\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.216338 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-client-ca\") pod \"controller-manager-6c54555755-wx24k\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.216397 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: 
\"kubernetes.io/configmap/ab25744c-43c6-4303-9dd4-048b634054b6-client-ca\") pod \"route-controller-manager-6f9c758cfc-d5rzl\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.216428 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab25744c-43c6-4303-9dd4-048b634054b6-config\") pod \"route-controller-manager-6f9c758cfc-d5rzl\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.216446 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qr4wr\" (UniqueName: \"kubernetes.io/projected/5d2ee826-6d61-492b-a791-443185dc78e6-kube-api-access-qr4wr\") pod \"controller-manager-6c54555755-wx24k\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.216496 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-proxy-ca-bundles\") pod \"controller-manager-6c54555755-wx24k\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.216525 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d2ee826-6d61-492b-a791-443185dc78e6-serving-cert\") pod \"controller-manager-6c54555755-wx24k\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " 
pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.317450 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-htvsl\" (UniqueName: \"kubernetes.io/projected/ab25744c-43c6-4303-9dd4-048b634054b6-kube-api-access-htvsl\") pod \"route-controller-manager-6f9c758cfc-d5rzl\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.317515 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab25744c-43c6-4303-9dd4-048b634054b6-serving-cert\") pod \"route-controller-manager-6f9c758cfc-d5rzl\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.317546 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-client-ca\") pod \"controller-manager-6c54555755-wx24k\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.317596 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab25744c-43c6-4303-9dd4-048b634054b6-client-ca\") pod \"route-controller-manager-6f9c758cfc-d5rzl\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.317635 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/ab25744c-43c6-4303-9dd4-048b634054b6-config\") pod \"route-controller-manager-6f9c758cfc-d5rzl\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.317655 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qr4wr\" (UniqueName: \"kubernetes.io/projected/5d2ee826-6d61-492b-a791-443185dc78e6-kube-api-access-qr4wr\") pod \"controller-manager-6c54555755-wx24k\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.317689 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-proxy-ca-bundles\") pod \"controller-manager-6c54555755-wx24k\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.317712 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d2ee826-6d61-492b-a791-443185dc78e6-serving-cert\") pod \"controller-manager-6c54555755-wx24k\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.317745 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-config\") pod \"controller-manager-6c54555755-wx24k\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 
10:28:55.318900 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab25744c-43c6-4303-9dd4-048b634054b6-client-ca\") pod \"route-controller-manager-6f9c758cfc-d5rzl\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.319295 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-client-ca\") pod \"controller-manager-6c54555755-wx24k\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.319378 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab25744c-43c6-4303-9dd4-048b634054b6-config\") pod \"route-controller-manager-6f9c758cfc-d5rzl\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.319633 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-config\") pod \"controller-manager-6c54555755-wx24k\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.319909 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-proxy-ca-bundles\") pod \"controller-manager-6c54555755-wx24k\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " 
pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.323533 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab25744c-43c6-4303-9dd4-048b634054b6-serving-cert\") pod \"route-controller-manager-6f9c758cfc-d5rzl\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.325321 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d2ee826-6d61-492b-a791-443185dc78e6-serving-cert\") pod \"controller-manager-6c54555755-wx24k\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.336173 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qr4wr\" (UniqueName: \"kubernetes.io/projected/5d2ee826-6d61-492b-a791-443185dc78e6-kube-api-access-qr4wr\") pod \"controller-manager-6c54555755-wx24k\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.336537 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-htvsl\" (UniqueName: \"kubernetes.io/projected/ab25744c-43c6-4303-9dd4-048b634054b6-kube-api-access-htvsl\") pod \"route-controller-manager-6f9c758cfc-d5rzl\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.446094 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.455212 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.980450 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl"] Mar 19 10:28:55 crc kubenswrapper[4765]: I0319 10:28:55.986080 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6c54555755-wx24k"] Mar 19 10:28:55 crc kubenswrapper[4765]: W0319 10:28:55.993490 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab25744c_43c6_4303_9dd4_048b634054b6.slice/crio-d5eb50f8c87c4f6bfb1bd2d69d5725ec5fd31036d542c7912d4697d6c4190363 WatchSource:0}: Error finding container d5eb50f8c87c4f6bfb1bd2d69d5725ec5fd31036d542c7912d4697d6c4190363: Status 404 returned error can't find the container with id d5eb50f8c87c4f6bfb1bd2d69d5725ec5fd31036d542c7912d4697d6c4190363 Mar 19 10:28:56 crc kubenswrapper[4765]: I0319 10:28:56.363596 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a791f1a0-5735-45b7-b984-5299cca0c90c" path="/var/lib/kubelet/pods/a791f1a0-5735-45b7-b984-5299cca0c90c/volumes" Mar 19 10:28:56 crc kubenswrapper[4765]: I0319 10:28:56.364836 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8e40efe-3ec1-479d-a2ca-53f44efc838e" path="/var/lib/kubelet/pods/a8e40efe-3ec1-479d-a2ca-53f44efc838e/volumes" Mar 19 10:28:57 crc kubenswrapper[4765]: I0319 10:28:57.004498 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" 
event={"ID":"5d2ee826-6d61-492b-a791-443185dc78e6","Type":"ContainerStarted","Data":"bb0fa71879c615553566b2c375fe2265a718361cb0b1da8626ced91bacaf645f"} Mar 19 10:28:57 crc kubenswrapper[4765]: I0319 10:28:57.004563 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" event={"ID":"5d2ee826-6d61-492b-a791-443185dc78e6","Type":"ContainerStarted","Data":"26bf85529175a64f207e77dbaf5f5c3b6d6470ac0eb00fb176c01fe33f29855b"} Mar 19 10:28:57 crc kubenswrapper[4765]: I0319 10:28:57.005225 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:57 crc kubenswrapper[4765]: I0319 10:28:57.015036 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:28:57 crc kubenswrapper[4765]: I0319 10:28:57.017004 4765 generic.go:334] "Generic (PLEG): container finished" podID="6237c3c3-e25e-4b5d-8b7a-66198a313195" containerID="14a43eed4dfe0f9e563278b73ce4359443e0ce2531c928ab9e6ade729c8a7be6" exitCode=0 Mar 19 10:28:57 crc kubenswrapper[4765]: I0319 10:28:57.017047 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565268-t2x8x" event={"ID":"6237c3c3-e25e-4b5d-8b7a-66198a313195","Type":"ContainerDied","Data":"14a43eed4dfe0f9e563278b73ce4359443e0ce2531c928ab9e6ade729c8a7be6"} Mar 19 10:28:57 crc kubenswrapper[4765]: I0319 10:28:57.019082 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" event={"ID":"ab25744c-43c6-4303-9dd4-048b634054b6","Type":"ContainerStarted","Data":"91fe485b8c9e70039e92cd64618ac48ec8c86780ab6016d0913dda548ac314fc"} Mar 19 10:28:57 crc kubenswrapper[4765]: I0319 10:28:57.019124 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" event={"ID":"ab25744c-43c6-4303-9dd4-048b634054b6","Type":"ContainerStarted","Data":"d5eb50f8c87c4f6bfb1bd2d69d5725ec5fd31036d542c7912d4697d6c4190363"} Mar 19 10:28:57 crc kubenswrapper[4765]: I0319 10:28:57.019509 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:57 crc kubenswrapper[4765]: I0319 10:28:57.025430 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:28:57 crc kubenswrapper[4765]: I0319 10:28:57.035134 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" podStartSLOduration=4.035103545 podStartE2EDuration="4.035103545s" podCreationTimestamp="2026-03-19 10:28:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:28:57.03380885 +0000 UTC m=+435.382754402" watchObservedRunningTime="2026-03-19 10:28:57.035103545 +0000 UTC m=+435.384049087" Mar 19 10:28:57 crc kubenswrapper[4765]: I0319 10:28:57.094714 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" podStartSLOduration=4.094688244 podStartE2EDuration="4.094688244s" podCreationTimestamp="2026-03-19 10:28:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:28:57.093903512 +0000 UTC m=+435.442849054" watchObservedRunningTime="2026-03-19 10:28:57.094688244 +0000 UTC m=+435.443633786" Mar 19 10:28:58 crc kubenswrapper[4765]: I0319 10:28:58.314572 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565268-t2x8x" Mar 19 10:28:58 crc kubenswrapper[4765]: I0319 10:28:58.364722 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27w7r\" (UniqueName: \"kubernetes.io/projected/6237c3c3-e25e-4b5d-8b7a-66198a313195-kube-api-access-27w7r\") pod \"6237c3c3-e25e-4b5d-8b7a-66198a313195\" (UID: \"6237c3c3-e25e-4b5d-8b7a-66198a313195\") " Mar 19 10:28:58 crc kubenswrapper[4765]: I0319 10:28:58.377205 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6237c3c3-e25e-4b5d-8b7a-66198a313195-kube-api-access-27w7r" (OuterVolumeSpecName: "kube-api-access-27w7r") pod "6237c3c3-e25e-4b5d-8b7a-66198a313195" (UID: "6237c3c3-e25e-4b5d-8b7a-66198a313195"). InnerVolumeSpecName "kube-api-access-27w7r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:28:58 crc kubenswrapper[4765]: I0319 10:28:58.466593 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27w7r\" (UniqueName: \"kubernetes.io/projected/6237c3c3-e25e-4b5d-8b7a-66198a313195-kube-api-access-27w7r\") on node \"crc\" DevicePath \"\"" Mar 19 10:28:59 crc kubenswrapper[4765]: I0319 10:28:59.034774 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565268-t2x8x" event={"ID":"6237c3c3-e25e-4b5d-8b7a-66198a313195","Type":"ContainerDied","Data":"c957c7535a30a7d47d6811720d70ecae3c7a5c0987f8eb9b8bd520ece00378e1"} Mar 19 10:28:59 crc kubenswrapper[4765]: I0319 10:28:59.034829 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c957c7535a30a7d47d6811720d70ecae3c7a5c0987f8eb9b8bd520ece00378e1" Mar 19 10:28:59 crc kubenswrapper[4765]: I0319 10:28:59.035466 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565268-t2x8x" Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.407285 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rdfk6"] Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.408489 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-rdfk6" podUID="bc196990-77bd-4e55-9380-1fa14ec297bf" containerName="registry-server" containerID="cri-o://80c26f1af13018b4e3fd24cb42eff50ad38ed62a68f1711275377a2ad9a000b7" gracePeriod=30 Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.413941 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84nts"] Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.414369 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-84nts" podUID="0e281996-1607-4eab-a87f-f4434f4dd17a" containerName="registry-server" containerID="cri-o://89126b4470441acf75af9e9f53fd70815ee12798791b6309bfe153af1bfd8a8f" gracePeriod=30 Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.428326 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6wpdr"] Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.428830 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" podUID="f27f5c72-19c7-4d66-b927-0eae532ff4fe" containerName="marketplace-operator" containerID="cri-o://a33e13ece53709b3a00decabb15c5aed0fd61ea1b0da2fee91c5128199dc861c" gracePeriod=30 Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.443908 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mm2f"] Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.444394 4765 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-8mm2f" podUID="49e1c321-1087-47b4-a9ef-446e4cef558e" containerName="registry-server" containerID="cri-o://c08c2b344ee219e96638d1552650b343b613249f27a8daec26b6418f5bbefa65" gracePeriod=30 Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.458051 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9ddtf"] Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.458476 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-9ddtf" podUID="107a869c-7528-417c-a633-e775a88a3cea" containerName="registry-server" containerID="cri-o://722b034e36aa3c7894cd3efbf217e9122089608fc33e8bbe8f792e981cadb0f9" gracePeriod=30 Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.461562 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6xvzl"] Mar 19 10:29:07 crc kubenswrapper[4765]: E0319 10:29:07.461883 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6237c3c3-e25e-4b5d-8b7a-66198a313195" containerName="oc" Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.461909 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6237c3c3-e25e-4b5d-8b7a-66198a313195" containerName="oc" Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.462055 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="6237c3c3-e25e-4b5d-8b7a-66198a313195" containerName="oc" Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.467678 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.483587 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6xvzl"] Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.520890 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5e3d8e97-79f8-43d2-acf6-f20ef33cadd3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6xvzl\" (UID: \"5e3d8e97-79f8-43d2-acf6-f20ef33cadd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.521244 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqpkl\" (UniqueName: \"kubernetes.io/projected/5e3d8e97-79f8-43d2-acf6-f20ef33cadd3-kube-api-access-jqpkl\") pod \"marketplace-operator-79b997595-6xvzl\" (UID: \"5e3d8e97-79f8-43d2-acf6-f20ef33cadd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.521279 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e3d8e97-79f8-43d2-acf6-f20ef33cadd3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6xvzl\" (UID: \"5e3d8e97-79f8-43d2-acf6-f20ef33cadd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.623321 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5e3d8e97-79f8-43d2-acf6-f20ef33cadd3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6xvzl\" (UID: 
\"5e3d8e97-79f8-43d2-acf6-f20ef33cadd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.625016 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jqpkl\" (UniqueName: \"kubernetes.io/projected/5e3d8e97-79f8-43d2-acf6-f20ef33cadd3-kube-api-access-jqpkl\") pod \"marketplace-operator-79b997595-6xvzl\" (UID: \"5e3d8e97-79f8-43d2-acf6-f20ef33cadd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.625079 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e3d8e97-79f8-43d2-acf6-f20ef33cadd3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6xvzl\" (UID: \"5e3d8e97-79f8-43d2-acf6-f20ef33cadd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.627128 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/5e3d8e97-79f8-43d2-acf6-f20ef33cadd3-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6xvzl\" (UID: \"5e3d8e97-79f8-43d2-acf6-f20ef33cadd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.636833 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/5e3d8e97-79f8-43d2-acf6-f20ef33cadd3-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6xvzl\" (UID: \"5e3d8e97-79f8-43d2-acf6-f20ef33cadd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.648087 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqpkl\" 
(UniqueName: \"kubernetes.io/projected/5e3d8e97-79f8-43d2-acf6-f20ef33cadd3-kube-api-access-jqpkl\") pod \"marketplace-operator-79b997595-6xvzl\" (UID: \"5e3d8e97-79f8-43d2-acf6-f20ef33cadd3\") " pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" Mar 19 10:29:07 crc kubenswrapper[4765]: I0319 10:29:07.923499 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.026461 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-84nts" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.107910 4765 generic.go:334] "Generic (PLEG): container finished" podID="0e281996-1607-4eab-a87f-f4434f4dd17a" containerID="89126b4470441acf75af9e9f53fd70815ee12798791b6309bfe153af1bfd8a8f" exitCode=0 Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.108307 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84nts" event={"ID":"0e281996-1607-4eab-a87f-f4434f4dd17a","Type":"ContainerDied","Data":"89126b4470441acf75af9e9f53fd70815ee12798791b6309bfe153af1bfd8a8f"} Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.108378 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-84nts" event={"ID":"0e281996-1607-4eab-a87f-f4434f4dd17a","Type":"ContainerDied","Data":"194763c947fddaa6e5095ed5c794da435f48b0a727ddbe0576af3fec2fe414bf"} Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.108395 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-84nts" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.108408 4765 scope.go:117] "RemoveContainer" containerID="89126b4470441acf75af9e9f53fd70815ee12798791b6309bfe153af1bfd8a8f" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.133404 4765 generic.go:334] "Generic (PLEG): container finished" podID="49e1c321-1087-47b4-a9ef-446e4cef558e" containerID="c08c2b344ee219e96638d1552650b343b613249f27a8daec26b6418f5bbefa65" exitCode=0 Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.133493 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mm2f" event={"ID":"49e1c321-1087-47b4-a9ef-446e4cef558e","Type":"ContainerDied","Data":"c08c2b344ee219e96638d1552650b343b613249f27a8daec26b6418f5bbefa65"} Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.141938 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e281996-1607-4eab-a87f-f4434f4dd17a-utilities\") pod \"0e281996-1607-4eab-a87f-f4434f4dd17a\" (UID: \"0e281996-1607-4eab-a87f-f4434f4dd17a\") " Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.142067 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7n6x4\" (UniqueName: \"kubernetes.io/projected/0e281996-1607-4eab-a87f-f4434f4dd17a-kube-api-access-7n6x4\") pod \"0e281996-1607-4eab-a87f-f4434f4dd17a\" (UID: \"0e281996-1607-4eab-a87f-f4434f4dd17a\") " Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.142171 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e281996-1607-4eab-a87f-f4434f4dd17a-catalog-content\") pod \"0e281996-1607-4eab-a87f-f4434f4dd17a\" (UID: \"0e281996-1607-4eab-a87f-f4434f4dd17a\") " Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.147804 4765 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e281996-1607-4eab-a87f-f4434f4dd17a-utilities" (OuterVolumeSpecName: "utilities") pod "0e281996-1607-4eab-a87f-f4434f4dd17a" (UID: "0e281996-1607-4eab-a87f-f4434f4dd17a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.156705 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0e281996-1607-4eab-a87f-f4434f4dd17a-kube-api-access-7n6x4" (OuterVolumeSpecName: "kube-api-access-7n6x4") pod "0e281996-1607-4eab-a87f-f4434f4dd17a" (UID: "0e281996-1607-4eab-a87f-f4434f4dd17a"). InnerVolumeSpecName "kube-api-access-7n6x4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.165162 4765 scope.go:117] "RemoveContainer" containerID="fbc6cd53ad3e1a8d0dbd311410454858404a71579b871a2169272979b1e546ca" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.165503 4765 generic.go:334] "Generic (PLEG): container finished" podID="107a869c-7528-417c-a633-e775a88a3cea" containerID="722b034e36aa3c7894cd3efbf217e9122089608fc33e8bbe8f792e981cadb0f9" exitCode=0 Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.165581 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ddtf" event={"ID":"107a869c-7528-417c-a633-e775a88a3cea","Type":"ContainerDied","Data":"722b034e36aa3c7894cd3efbf217e9122089608fc33e8bbe8f792e981cadb0f9"} Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.186792 4765 generic.go:334] "Generic (PLEG): container finished" podID="f27f5c72-19c7-4d66-b927-0eae532ff4fe" containerID="a33e13ece53709b3a00decabb15c5aed0fd61ea1b0da2fee91c5128199dc861c" exitCode=0 Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.187134 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" 
event={"ID":"f27f5c72-19c7-4d66-b927-0eae532ff4fe","Type":"ContainerDied","Data":"a33e13ece53709b3a00decabb15c5aed0fd61ea1b0da2fee91c5128199dc861c"} Mar 19 10:29:08 crc kubenswrapper[4765]: E0319 10:29:08.205154 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 722b034e36aa3c7894cd3efbf217e9122089608fc33e8bbe8f792e981cadb0f9 is running failed: container process not found" containerID="722b034e36aa3c7894cd3efbf217e9122089608fc33e8bbe8f792e981cadb0f9" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.212567 4765 scope.go:117] "RemoveContainer" containerID="774b9113a441c342938c320af29040ecd783d4581144aaf4bc1cfe0cff8765f7" Mar 19 10:29:08 crc kubenswrapper[4765]: E0319 10:29:08.218243 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 722b034e36aa3c7894cd3efbf217e9122089608fc33e8bbe8f792e981cadb0f9 is running failed: container process not found" containerID="722b034e36aa3c7894cd3efbf217e9122089608fc33e8bbe8f792e981cadb0f9" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.220173 4765 generic.go:334] "Generic (PLEG): container finished" podID="bc196990-77bd-4e55-9380-1fa14ec297bf" containerID="80c26f1af13018b4e3fd24cb42eff50ad38ed62a68f1711275377a2ad9a000b7" exitCode=0 Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.220226 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdfk6" event={"ID":"bc196990-77bd-4e55-9380-1fa14ec297bf","Type":"ContainerDied","Data":"80c26f1af13018b4e3fd24cb42eff50ad38ed62a68f1711275377a2ad9a000b7"} Mar 19 10:29:08 crc kubenswrapper[4765]: E0319 10:29:08.223038 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or 
running: checking if PID of 722b034e36aa3c7894cd3efbf217e9122089608fc33e8bbe8f792e981cadb0f9 is running failed: container process not found" containerID="722b034e36aa3c7894cd3efbf217e9122089608fc33e8bbe8f792e981cadb0f9" cmd=["grpc_health_probe","-addr=:50051"] Mar 19 10:29:08 crc kubenswrapper[4765]: E0319 10:29:08.223082 4765 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 722b034e36aa3c7894cd3efbf217e9122089608fc33e8bbe8f792e981cadb0f9 is running failed: container process not found" probeType="Readiness" pod="openshift-marketplace/redhat-operators-9ddtf" podUID="107a869c-7528-417c-a633-e775a88a3cea" containerName="registry-server" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.244419 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0e281996-1607-4eab-a87f-f4434f4dd17a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0e281996-1607-4eab-a87f-f4434f4dd17a" (UID: "0e281996-1607-4eab-a87f-f4434f4dd17a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.244877 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0e281996-1607-4eab-a87f-f4434f4dd17a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.244896 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0e281996-1607-4eab-a87f-f4434f4dd17a-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.244910 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7n6x4\" (UniqueName: \"kubernetes.io/projected/0e281996-1607-4eab-a87f-f4434f4dd17a-kube-api-access-7n6x4\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.254263 4765 scope.go:117] "RemoveContainer" containerID="89126b4470441acf75af9e9f53fd70815ee12798791b6309bfe153af1bfd8a8f" Mar 19 10:29:08 crc kubenswrapper[4765]: E0319 10:29:08.273216 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"89126b4470441acf75af9e9f53fd70815ee12798791b6309bfe153af1bfd8a8f\": container with ID starting with 89126b4470441acf75af9e9f53fd70815ee12798791b6309bfe153af1bfd8a8f not found: ID does not exist" containerID="89126b4470441acf75af9e9f53fd70815ee12798791b6309bfe153af1bfd8a8f" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.273311 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"89126b4470441acf75af9e9f53fd70815ee12798791b6309bfe153af1bfd8a8f"} err="failed to get container status \"89126b4470441acf75af9e9f53fd70815ee12798791b6309bfe153af1bfd8a8f\": rpc error: code = NotFound desc = could not find container \"89126b4470441acf75af9e9f53fd70815ee12798791b6309bfe153af1bfd8a8f\": container with ID 
starting with 89126b4470441acf75af9e9f53fd70815ee12798791b6309bfe153af1bfd8a8f not found: ID does not exist" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.273376 4765 scope.go:117] "RemoveContainer" containerID="fbc6cd53ad3e1a8d0dbd311410454858404a71579b871a2169272979b1e546ca" Mar 19 10:29:08 crc kubenswrapper[4765]: E0319 10:29:08.274704 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbc6cd53ad3e1a8d0dbd311410454858404a71579b871a2169272979b1e546ca\": container with ID starting with fbc6cd53ad3e1a8d0dbd311410454858404a71579b871a2169272979b1e546ca not found: ID does not exist" containerID="fbc6cd53ad3e1a8d0dbd311410454858404a71579b871a2169272979b1e546ca" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.274734 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbc6cd53ad3e1a8d0dbd311410454858404a71579b871a2169272979b1e546ca"} err="failed to get container status \"fbc6cd53ad3e1a8d0dbd311410454858404a71579b871a2169272979b1e546ca\": rpc error: code = NotFound desc = could not find container \"fbc6cd53ad3e1a8d0dbd311410454858404a71579b871a2169272979b1e546ca\": container with ID starting with fbc6cd53ad3e1a8d0dbd311410454858404a71579b871a2169272979b1e546ca not found: ID does not exist" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.274759 4765 scope.go:117] "RemoveContainer" containerID="774b9113a441c342938c320af29040ecd783d4581144aaf4bc1cfe0cff8765f7" Mar 19 10:29:08 crc kubenswrapper[4765]: E0319 10:29:08.275265 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"774b9113a441c342938c320af29040ecd783d4581144aaf4bc1cfe0cff8765f7\": container with ID starting with 774b9113a441c342938c320af29040ecd783d4581144aaf4bc1cfe0cff8765f7 not found: ID does not exist" containerID="774b9113a441c342938c320af29040ecd783d4581144aaf4bc1cfe0cff8765f7" Mar 19 
10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.275322 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"774b9113a441c342938c320af29040ecd783d4581144aaf4bc1cfe0cff8765f7"} err="failed to get container status \"774b9113a441c342938c320af29040ecd783d4581144aaf4bc1cfe0cff8765f7\": rpc error: code = NotFound desc = could not find container \"774b9113a441c342938c320af29040ecd783d4581144aaf4bc1cfe0cff8765f7\": container with ID starting with 774b9113a441c342938c320af29040ecd783d4581144aaf4bc1cfe0cff8765f7 not found: ID does not exist" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.275347 4765 scope.go:117] "RemoveContainer" containerID="d5157a82387bb5080cd401571c48cc1a277e4ccbfe656a5a755dd8c507080b2a" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.340393 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9ddtf" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.377206 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mm2f" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.381727 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rdfk6" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.390332 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.447813 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f27f5c72-19c7-4d66-b927-0eae532ff4fe-marketplace-trusted-ca\") pod \"f27f5c72-19c7-4d66-b927-0eae532ff4fe\" (UID: \"f27f5c72-19c7-4d66-b927-0eae532ff4fe\") " Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.447874 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qv5lb\" (UniqueName: \"kubernetes.io/projected/107a869c-7528-417c-a633-e775a88a3cea-kube-api-access-qv5lb\") pod \"107a869c-7528-417c-a633-e775a88a3cea\" (UID: \"107a869c-7528-417c-a633-e775a88a3cea\") " Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.448028 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49e1c321-1087-47b4-a9ef-446e4cef558e-utilities\") pod \"49e1c321-1087-47b4-a9ef-446e4cef558e\" (UID: \"49e1c321-1087-47b4-a9ef-446e4cef558e\") " Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.448055 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f27f5c72-19c7-4d66-b927-0eae532ff4fe-marketplace-operator-metrics\") pod \"f27f5c72-19c7-4d66-b927-0eae532ff4fe\" (UID: \"f27f5c72-19c7-4d66-b927-0eae532ff4fe\") " Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.448078 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc196990-77bd-4e55-9380-1fa14ec297bf-utilities\") pod \"bc196990-77bd-4e55-9380-1fa14ec297bf\" (UID: \"bc196990-77bd-4e55-9380-1fa14ec297bf\") " Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.448104 4765 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bc196990-77bd-4e55-9380-1fa14ec297bf-catalog-content\") pod \"bc196990-77bd-4e55-9380-1fa14ec297bf\" (UID: \"bc196990-77bd-4e55-9380-1fa14ec297bf\") " Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.448126 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/107a869c-7528-417c-a633-e775a88a3cea-catalog-content\") pod \"107a869c-7528-417c-a633-e775a88a3cea\" (UID: \"107a869c-7528-417c-a633-e775a88a3cea\") " Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.448150 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mcpb\" (UniqueName: \"kubernetes.io/projected/f27f5c72-19c7-4d66-b927-0eae532ff4fe-kube-api-access-4mcpb\") pod \"f27f5c72-19c7-4d66-b927-0eae532ff4fe\" (UID: \"f27f5c72-19c7-4d66-b927-0eae532ff4fe\") " Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.448181 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9s2p5\" (UniqueName: \"kubernetes.io/projected/bc196990-77bd-4e55-9380-1fa14ec297bf-kube-api-access-9s2p5\") pod \"bc196990-77bd-4e55-9380-1fa14ec297bf\" (UID: \"bc196990-77bd-4e55-9380-1fa14ec297bf\") " Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.448218 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/107a869c-7528-417c-a633-e775a88a3cea-utilities\") pod \"107a869c-7528-417c-a633-e775a88a3cea\" (UID: \"107a869c-7528-417c-a633-e775a88a3cea\") " Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.448241 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l6zzm\" (UniqueName: \"kubernetes.io/projected/49e1c321-1087-47b4-a9ef-446e4cef558e-kube-api-access-l6zzm\") pod 
\"49e1c321-1087-47b4-a9ef-446e4cef558e\" (UID: \"49e1c321-1087-47b4-a9ef-446e4cef558e\") " Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.448271 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49e1c321-1087-47b4-a9ef-446e4cef558e-catalog-content\") pod \"49e1c321-1087-47b4-a9ef-446e4cef558e\" (UID: \"49e1c321-1087-47b4-a9ef-446e4cef558e\") " Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.449619 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49e1c321-1087-47b4-a9ef-446e4cef558e-utilities" (OuterVolumeSpecName: "utilities") pod "49e1c321-1087-47b4-a9ef-446e4cef558e" (UID: "49e1c321-1087-47b4-a9ef-446e4cef558e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.450412 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f27f5c72-19c7-4d66-b927-0eae532ff4fe-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "f27f5c72-19c7-4d66-b927-0eae532ff4fe" (UID: "f27f5c72-19c7-4d66-b927-0eae532ff4fe"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.450439 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc196990-77bd-4e55-9380-1fa14ec297bf-utilities" (OuterVolumeSpecName: "utilities") pod "bc196990-77bd-4e55-9380-1fa14ec297bf" (UID: "bc196990-77bd-4e55-9380-1fa14ec297bf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.451293 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107a869c-7528-417c-a633-e775a88a3cea-utilities" (OuterVolumeSpecName: "utilities") pod "107a869c-7528-417c-a633-e775a88a3cea" (UID: "107a869c-7528-417c-a633-e775a88a3cea"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.471436 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f27f5c72-19c7-4d66-b927-0eae532ff4fe-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "f27f5c72-19c7-4d66-b927-0eae532ff4fe" (UID: "f27f5c72-19c7-4d66-b927-0eae532ff4fe"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.474618 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49e1c321-1087-47b4-a9ef-446e4cef558e-kube-api-access-l6zzm" (OuterVolumeSpecName: "kube-api-access-l6zzm") pod "49e1c321-1087-47b4-a9ef-446e4cef558e" (UID: "49e1c321-1087-47b4-a9ef-446e4cef558e"). InnerVolumeSpecName "kube-api-access-l6zzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.474616 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/107a869c-7528-417c-a633-e775a88a3cea-kube-api-access-qv5lb" (OuterVolumeSpecName: "kube-api-access-qv5lb") pod "107a869c-7528-417c-a633-e775a88a3cea" (UID: "107a869c-7528-417c-a633-e775a88a3cea"). InnerVolumeSpecName "kube-api-access-qv5lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.474747 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc196990-77bd-4e55-9380-1fa14ec297bf-kube-api-access-9s2p5" (OuterVolumeSpecName: "kube-api-access-9s2p5") pod "bc196990-77bd-4e55-9380-1fa14ec297bf" (UID: "bc196990-77bd-4e55-9380-1fa14ec297bf"). InnerVolumeSpecName "kube-api-access-9s2p5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.476132 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f27f5c72-19c7-4d66-b927-0eae532ff4fe-kube-api-access-4mcpb" (OuterVolumeSpecName: "kube-api-access-4mcpb") pod "f27f5c72-19c7-4d66-b927-0eae532ff4fe" (UID: "f27f5c72-19c7-4d66-b927-0eae532ff4fe"). InnerVolumeSpecName "kube-api-access-4mcpb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.498941 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-84nts"] Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.505334 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49e1c321-1087-47b4-a9ef-446e4cef558e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49e1c321-1087-47b4-a9ef-446e4cef558e" (UID: "49e1c321-1087-47b4-a9ef-446e4cef558e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.506393 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-84nts"] Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.539488 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc196990-77bd-4e55-9380-1fa14ec297bf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bc196990-77bd-4e55-9380-1fa14ec297bf" (UID: "bc196990-77bd-4e55-9380-1fa14ec297bf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.550657 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qv5lb\" (UniqueName: \"kubernetes.io/projected/107a869c-7528-417c-a633-e775a88a3cea-kube-api-access-qv5lb\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.550703 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49e1c321-1087-47b4-a9ef-446e4cef558e-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.550717 4765 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/f27f5c72-19c7-4d66-b927-0eae532ff4fe-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.550728 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bc196990-77bd-4e55-9380-1fa14ec297bf-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.550739 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/bc196990-77bd-4e55-9380-1fa14ec297bf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.550750 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mcpb\" (UniqueName: \"kubernetes.io/projected/f27f5c72-19c7-4d66-b927-0eae532ff4fe-kube-api-access-4mcpb\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.550761 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9s2p5\" (UniqueName: \"kubernetes.io/projected/bc196990-77bd-4e55-9380-1fa14ec297bf-kube-api-access-9s2p5\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.550772 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/107a869c-7528-417c-a633-e775a88a3cea-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.550783 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l6zzm\" (UniqueName: \"kubernetes.io/projected/49e1c321-1087-47b4-a9ef-446e4cef558e-kube-api-access-l6zzm\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.550798 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49e1c321-1087-47b4-a9ef-446e4cef558e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.550809 4765 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f27f5c72-19c7-4d66-b927-0eae532ff4fe-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.648242 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/107a869c-7528-417c-a633-e775a88a3cea-catalog-content" 
(OuterVolumeSpecName: "catalog-content") pod "107a869c-7528-417c-a633-e775a88a3cea" (UID: "107a869c-7528-417c-a633-e775a88a3cea"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.652591 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/107a869c-7528-417c-a633-e775a88a3cea-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:08 crc kubenswrapper[4765]: I0319 10:29:08.654814 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6xvzl"] Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.233442 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-rdfk6" event={"ID":"bc196990-77bd-4e55-9380-1fa14ec297bf","Type":"ContainerDied","Data":"43c336263b1c5e7c25f02a2c288d7781a084db7a8cb47fa3ce0f2c9b8fce5e73"} Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.233522 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-rdfk6" Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.233535 4765 scope.go:117] "RemoveContainer" containerID="80c26f1af13018b4e3fd24cb42eff50ad38ed62a68f1711275377a2ad9a000b7" Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.238061 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-8mm2f" event={"ID":"49e1c321-1087-47b4-a9ef-446e4cef558e","Type":"ContainerDied","Data":"6cc07598dcd37af7e9f50607e5924fa1560aa41549d1e306d6015fdfc1721caa"} Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.238209 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-8mm2f" Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.242818 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9ddtf" event={"ID":"107a869c-7528-417c-a633-e775a88a3cea","Type":"ContainerDied","Data":"f9d69d6a65a35f854845b2540e33624cda093d383f13f1d7a28bd87d81211cf6"} Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.242949 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9ddtf" Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.248393 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.248629 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6wpdr" event={"ID":"f27f5c72-19c7-4d66-b927-0eae532ff4fe","Type":"ContainerDied","Data":"29c3ded5eb301c37b9a131acbf93ff931ae30a728a251a8c9e530426184fc0f7"} Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.254543 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" event={"ID":"5e3d8e97-79f8-43d2-acf6-f20ef33cadd3","Type":"ContainerStarted","Data":"fe387cc10b0631ec9550bf7624e5b447095a08be3c50be11854807fbd81c5e91"} Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.254870 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" event={"ID":"5e3d8e97-79f8-43d2-acf6-f20ef33cadd3","Type":"ContainerStarted","Data":"bfe184bf0a81608d40e78286af732548058caebe13cbd6e52dea83eb19a28625"} Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.271010 4765 scope.go:117] "RemoveContainer" containerID="93261144fa403f4f672b2747aa637f1bdf07c249e4afd591e627d816afeaf16b" Mar 19 10:29:09 
crc kubenswrapper[4765]: I0319 10:29:09.310632 4765 scope.go:117] "RemoveContainer" containerID="0678fb96ef88d0a4aeba650a68a14b4d470047d770487d64a08feeaf8ed63f46" Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.320902 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" podStartSLOduration=2.320867336 podStartE2EDuration="2.320867336s" podCreationTimestamp="2026-03-19 10:29:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:29:09.306011993 +0000 UTC m=+447.654957555" watchObservedRunningTime="2026-03-19 10:29:09.320867336 +0000 UTC m=+447.669812868" Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.325687 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-rdfk6"] Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.329661 4765 scope.go:117] "RemoveContainer" containerID="c08c2b344ee219e96638d1552650b343b613249f27a8daec26b6418f5bbefa65" Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.333898 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-rdfk6"] Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.351090 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mm2f"] Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.352502 4765 scope.go:117] "RemoveContainer" containerID="13f65ac4e255c3cb8fa03fbcf460ee9161f6b15d0875e65545005024b0ef72d4" Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.358921 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-8mm2f"] Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.386104 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6wpdr"] Mar 19 10:29:09 
crc kubenswrapper[4765]: I0319 10:29:09.392781 4765 scope.go:117] "RemoveContainer" containerID="d25f7d9991331b77ff9502465b20dcc74b66ae55514e873efe7b4b97ab8565d0" Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.406876 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6wpdr"] Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.422017 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-9ddtf"] Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.426872 4765 scope.go:117] "RemoveContainer" containerID="722b034e36aa3c7894cd3efbf217e9122089608fc33e8bbe8f792e981cadb0f9" Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.431559 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-9ddtf"] Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.478234 4765 scope.go:117] "RemoveContainer" containerID="1d05e4a72115868454b2b59021206d9d77f4e1f1d491dab2074c39f47bf4d8b0" Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.499262 4765 scope.go:117] "RemoveContainer" containerID="4bffe71520afa64173fd9fa5f8f8f8745818ba42d88cde640c76b44b74574a5c" Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.516751 4765 scope.go:117] "RemoveContainer" containerID="a33e13ece53709b3a00decabb15c5aed0fd61ea1b0da2fee91c5128199dc861c" Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.795361 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl"] Mar 19 10:29:09 crc kubenswrapper[4765]: I0319 10:29:09.795681 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" podUID="ab25744c-43c6-4303-9dd4-048b634054b6" containerName="route-controller-manager" containerID="cri-o://91fe485b8c9e70039e92cd64618ac48ec8c86780ab6016d0913dda548ac314fc" 
gracePeriod=30 Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.264932 4765 generic.go:334] "Generic (PLEG): container finished" podID="ab25744c-43c6-4303-9dd4-048b634054b6" containerID="91fe485b8c9e70039e92cd64618ac48ec8c86780ab6016d0913dda548ac314fc" exitCode=0 Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.265042 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" event={"ID":"ab25744c-43c6-4303-9dd4-048b634054b6","Type":"ContainerDied","Data":"91fe485b8c9e70039e92cd64618ac48ec8c86780ab6016d0913dda548ac314fc"} Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.265100 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" event={"ID":"ab25744c-43c6-4303-9dd4-048b634054b6","Type":"ContainerDied","Data":"d5eb50f8c87c4f6bfb1bd2d69d5725ec5fd31036d542c7912d4697d6c4190363"} Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.265120 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d5eb50f8c87c4f6bfb1bd2d69d5725ec5fd31036d542c7912d4697d6c4190363" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.266238 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.269592 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6xvzl" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.303133 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.363835 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0e281996-1607-4eab-a87f-f4434f4dd17a" path="/var/lib/kubelet/pods/0e281996-1607-4eab-a87f-f4434f4dd17a/volumes" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.364625 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="107a869c-7528-417c-a633-e775a88a3cea" path="/var/lib/kubelet/pods/107a869c-7528-417c-a633-e775a88a3cea/volumes" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.365670 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49e1c321-1087-47b4-a9ef-446e4cef558e" path="/var/lib/kubelet/pods/49e1c321-1087-47b4-a9ef-446e4cef558e/volumes" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.367130 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc196990-77bd-4e55-9380-1fa14ec297bf" path="/var/lib/kubelet/pods/bc196990-77bd-4e55-9380-1fa14ec297bf/volumes" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.367819 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f27f5c72-19c7-4d66-b927-0eae532ff4fe" path="/var/lib/kubelet/pods/f27f5c72-19c7-4d66-b927-0eae532ff4fe/volumes" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.383401 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab25744c-43c6-4303-9dd4-048b634054b6-config\") pod \"ab25744c-43c6-4303-9dd4-048b634054b6\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.383467 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htvsl\" (UniqueName: \"kubernetes.io/projected/ab25744c-43c6-4303-9dd4-048b634054b6-kube-api-access-htvsl\") pod 
\"ab25744c-43c6-4303-9dd4-048b634054b6\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.383516 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab25744c-43c6-4303-9dd4-048b634054b6-serving-cert\") pod \"ab25744c-43c6-4303-9dd4-048b634054b6\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.383541 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab25744c-43c6-4303-9dd4-048b634054b6-client-ca\") pod \"ab25744c-43c6-4303-9dd4-048b634054b6\" (UID: \"ab25744c-43c6-4303-9dd4-048b634054b6\") " Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.385120 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab25744c-43c6-4303-9dd4-048b634054b6-client-ca" (OuterVolumeSpecName: "client-ca") pod "ab25744c-43c6-4303-9dd4-048b634054b6" (UID: "ab25744c-43c6-4303-9dd4-048b634054b6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.385337 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ab25744c-43c6-4303-9dd4-048b634054b6-config" (OuterVolumeSpecName: "config") pod "ab25744c-43c6-4303-9dd4-048b634054b6" (UID: "ab25744c-43c6-4303-9dd4-048b634054b6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.391408 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab25744c-43c6-4303-9dd4-048b634054b6-kube-api-access-htvsl" (OuterVolumeSpecName: "kube-api-access-htvsl") pod "ab25744c-43c6-4303-9dd4-048b634054b6" (UID: "ab25744c-43c6-4303-9dd4-048b634054b6"). 
InnerVolumeSpecName "kube-api-access-htvsl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.417674 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab25744c-43c6-4303-9dd4-048b634054b6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "ab25744c-43c6-4303-9dd4-048b634054b6" (UID: "ab25744c-43c6-4303-9dd4-048b634054b6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.426489 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-cd4rs"] Mar 19 10:29:10 crc kubenswrapper[4765]: E0319 10:29:10.426739 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e281996-1607-4eab-a87f-f4434f4dd17a" containerName="extract-content" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.426754 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e281996-1607-4eab-a87f-f4434f4dd17a" containerName="extract-content" Mar 19 10:29:10 crc kubenswrapper[4765]: E0319 10:29:10.426762 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e1c321-1087-47b4-a9ef-446e4cef558e" containerName="extract-content" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.426768 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e1c321-1087-47b4-a9ef-446e4cef558e" containerName="extract-content" Mar 19 10:29:10 crc kubenswrapper[4765]: E0319 10:29:10.426774 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e1c321-1087-47b4-a9ef-446e4cef558e" containerName="registry-server" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.426780 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e1c321-1087-47b4-a9ef-446e4cef558e" containerName="registry-server" Mar 19 10:29:10 crc kubenswrapper[4765]: E0319 10:29:10.426787 4765 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="107a869c-7528-417c-a633-e775a88a3cea" containerName="registry-server" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.426794 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="107a869c-7528-417c-a633-e775a88a3cea" containerName="registry-server" Mar 19 10:29:10 crc kubenswrapper[4765]: E0319 10:29:10.426808 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107a869c-7528-417c-a633-e775a88a3cea" containerName="extract-content" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.426815 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="107a869c-7528-417c-a633-e775a88a3cea" containerName="extract-content" Mar 19 10:29:10 crc kubenswrapper[4765]: E0319 10:29:10.426826 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc196990-77bd-4e55-9380-1fa14ec297bf" containerName="extract-utilities" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.426832 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc196990-77bd-4e55-9380-1fa14ec297bf" containerName="extract-utilities" Mar 19 10:29:10 crc kubenswrapper[4765]: E0319 10:29:10.426840 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e281996-1607-4eab-a87f-f4434f4dd17a" containerName="registry-server" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.426846 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e281996-1607-4eab-a87f-f4434f4dd17a" containerName="registry-server" Mar 19 10:29:10 crc kubenswrapper[4765]: E0319 10:29:10.426858 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0e281996-1607-4eab-a87f-f4434f4dd17a" containerName="extract-utilities" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.426864 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="0e281996-1607-4eab-a87f-f4434f4dd17a" containerName="extract-utilities" Mar 19 10:29:10 crc kubenswrapper[4765]: E0319 10:29:10.426872 4765 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="bc196990-77bd-4e55-9380-1fa14ec297bf" containerName="extract-content" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.426878 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc196990-77bd-4e55-9380-1fa14ec297bf" containerName="extract-content" Mar 19 10:29:10 crc kubenswrapper[4765]: E0319 10:29:10.426886 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49e1c321-1087-47b4-a9ef-446e4cef558e" containerName="extract-utilities" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.426893 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="49e1c321-1087-47b4-a9ef-446e4cef558e" containerName="extract-utilities" Mar 19 10:29:10 crc kubenswrapper[4765]: E0319 10:29:10.426904 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="107a869c-7528-417c-a633-e775a88a3cea" containerName="extract-utilities" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.426912 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="107a869c-7528-417c-a633-e775a88a3cea" containerName="extract-utilities" Mar 19 10:29:10 crc kubenswrapper[4765]: E0319 10:29:10.426947 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc196990-77bd-4e55-9380-1fa14ec297bf" containerName="registry-server" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.426958 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc196990-77bd-4e55-9380-1fa14ec297bf" containerName="registry-server" Mar 19 10:29:10 crc kubenswrapper[4765]: E0319 10:29:10.426985 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f27f5c72-19c7-4d66-b927-0eae532ff4fe" containerName="marketplace-operator" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.426993 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27f5c72-19c7-4d66-b927-0eae532ff4fe" containerName="marketplace-operator" Mar 19 10:29:10 crc kubenswrapper[4765]: E0319 10:29:10.427005 4765 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="f27f5c72-19c7-4d66-b927-0eae532ff4fe" containerName="marketplace-operator" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.427011 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f27f5c72-19c7-4d66-b927-0eae532ff4fe" containerName="marketplace-operator" Mar 19 10:29:10 crc kubenswrapper[4765]: E0319 10:29:10.427018 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab25744c-43c6-4303-9dd4-048b634054b6" containerName="route-controller-manager" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.427024 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab25744c-43c6-4303-9dd4-048b634054b6" containerName="route-controller-manager" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.427116 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="107a869c-7528-417c-a633-e775a88a3cea" containerName="registry-server" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.427129 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="49e1c321-1087-47b4-a9ef-446e4cef558e" containerName="registry-server" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.427139 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab25744c-43c6-4303-9dd4-048b634054b6" containerName="route-controller-manager" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.427147 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc196990-77bd-4e55-9380-1fa14ec297bf" containerName="registry-server" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.427154 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27f5c72-19c7-4d66-b927-0eae532ff4fe" containerName="marketplace-operator" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.427162 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="0e281996-1607-4eab-a87f-f4434f4dd17a" containerName="registry-server" Mar 19 10:29:10 crc 
kubenswrapper[4765]: I0319 10:29:10.427167 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f27f5c72-19c7-4d66-b927-0eae532ff4fe" containerName="marketplace-operator" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.427906 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-cd4rs" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.431215 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.439928 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cd4rs"] Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.485287 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00429786-088e-4d39-be6e-050615aeba42-catalog-content\") pod \"certified-operators-cd4rs\" (UID: \"00429786-088e-4d39-be6e-050615aeba42\") " pod="openshift-marketplace/certified-operators-cd4rs" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.485773 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tczn5\" (UniqueName: \"kubernetes.io/projected/00429786-088e-4d39-be6e-050615aeba42-kube-api-access-tczn5\") pod \"certified-operators-cd4rs\" (UID: \"00429786-088e-4d39-be6e-050615aeba42\") " pod="openshift-marketplace/certified-operators-cd4rs" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.487482 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00429786-088e-4d39-be6e-050615aeba42-utilities\") pod \"certified-operators-cd4rs\" (UID: \"00429786-088e-4d39-be6e-050615aeba42\") " pod="openshift-marketplace/certified-operators-cd4rs" Mar 
19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.487631 4765 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab25744c-43c6-4303-9dd4-048b634054b6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.487653 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab25744c-43c6-4303-9dd4-048b634054b6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.487665 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab25744c-43c6-4303-9dd4-048b634054b6-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.487676 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htvsl\" (UniqueName: \"kubernetes.io/projected/ab25744c-43c6-4303-9dd4-048b634054b6-kube-api-access-htvsl\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.589073 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00429786-088e-4d39-be6e-050615aeba42-utilities\") pod \"certified-operators-cd4rs\" (UID: \"00429786-088e-4d39-be6e-050615aeba42\") " pod="openshift-marketplace/certified-operators-cd4rs" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.589497 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00429786-088e-4d39-be6e-050615aeba42-catalog-content\") pod \"certified-operators-cd4rs\" (UID: \"00429786-088e-4d39-be6e-050615aeba42\") " pod="openshift-marketplace/certified-operators-cd4rs" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.589607 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-tczn5\" (UniqueName: \"kubernetes.io/projected/00429786-088e-4d39-be6e-050615aeba42-kube-api-access-tczn5\") pod \"certified-operators-cd4rs\" (UID: \"00429786-088e-4d39-be6e-050615aeba42\") " pod="openshift-marketplace/certified-operators-cd4rs" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.589759 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/00429786-088e-4d39-be6e-050615aeba42-utilities\") pod \"certified-operators-cd4rs\" (UID: \"00429786-088e-4d39-be6e-050615aeba42\") " pod="openshift-marketplace/certified-operators-cd4rs" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.590077 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/00429786-088e-4d39-be6e-050615aeba42-catalog-content\") pod \"certified-operators-cd4rs\" (UID: \"00429786-088e-4d39-be6e-050615aeba42\") " pod="openshift-marketplace/certified-operators-cd4rs" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.611611 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tczn5\" (UniqueName: \"kubernetes.io/projected/00429786-088e-4d39-be6e-050615aeba42-kube-api-access-tczn5\") pod \"certified-operators-cd4rs\" (UID: \"00429786-088e-4d39-be6e-050615aeba42\") " pod="openshift-marketplace/certified-operators-cd4rs" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.613432 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-kqvrg"] Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.614816 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-kqvrg" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.618411 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kqvrg"] Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.619436 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.691241 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306b7883-2d24-4ff3-9154-8a45be8447fb-utilities\") pod \"community-operators-kqvrg\" (UID: \"306b7883-2d24-4ff3-9154-8a45be8447fb\") " pod="openshift-marketplace/community-operators-kqvrg" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.691397 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306b7883-2d24-4ff3-9154-8a45be8447fb-catalog-content\") pod \"community-operators-kqvrg\" (UID: \"306b7883-2d24-4ff3-9154-8a45be8447fb\") " pod="openshift-marketplace/community-operators-kqvrg" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.691430 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-29n7v\" (UniqueName: \"kubernetes.io/projected/306b7883-2d24-4ff3-9154-8a45be8447fb-kube-api-access-29n7v\") pod \"community-operators-kqvrg\" (UID: \"306b7883-2d24-4ff3-9154-8a45be8447fb\") " pod="openshift-marketplace/community-operators-kqvrg" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.755392 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-cd4rs" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.793378 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306b7883-2d24-4ff3-9154-8a45be8447fb-catalog-content\") pod \"community-operators-kqvrg\" (UID: \"306b7883-2d24-4ff3-9154-8a45be8447fb\") " pod="openshift-marketplace/community-operators-kqvrg" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.793435 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-29n7v\" (UniqueName: \"kubernetes.io/projected/306b7883-2d24-4ff3-9154-8a45be8447fb-kube-api-access-29n7v\") pod \"community-operators-kqvrg\" (UID: \"306b7883-2d24-4ff3-9154-8a45be8447fb\") " pod="openshift-marketplace/community-operators-kqvrg" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.793579 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306b7883-2d24-4ff3-9154-8a45be8447fb-utilities\") pod \"community-operators-kqvrg\" (UID: \"306b7883-2d24-4ff3-9154-8a45be8447fb\") " pod="openshift-marketplace/community-operators-kqvrg" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.794413 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306b7883-2d24-4ff3-9154-8a45be8447fb-utilities\") pod \"community-operators-kqvrg\" (UID: \"306b7883-2d24-4ff3-9154-8a45be8447fb\") " pod="openshift-marketplace/community-operators-kqvrg" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.794548 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306b7883-2d24-4ff3-9154-8a45be8447fb-catalog-content\") pod \"community-operators-kqvrg\" (UID: \"306b7883-2d24-4ff3-9154-8a45be8447fb\") " 
pod="openshift-marketplace/community-operators-kqvrg" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.817099 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-29n7v\" (UniqueName: \"kubernetes.io/projected/306b7883-2d24-4ff3-9154-8a45be8447fb-kube-api-access-29n7v\") pod \"community-operators-kqvrg\" (UID: \"306b7883-2d24-4ff3-9154-8a45be8447fb\") " pod="openshift-marketplace/community-operators-kqvrg" Mar 19 10:29:10 crc kubenswrapper[4765]: I0319 10:29:10.962286 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kqvrg" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.125094 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd"] Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.126071 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.137211 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd"] Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.207026 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0c7e574-0448-4c1c-898d-694736cde753-client-ca\") pod \"route-controller-manager-6db58c4d9b-tcdqd\" (UID: \"d0c7e574-0448-4c1c-898d-694736cde753\") " pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.207242 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jb5l\" (UniqueName: 
\"kubernetes.io/projected/d0c7e574-0448-4c1c-898d-694736cde753-kube-api-access-9jb5l\") pod \"route-controller-manager-6db58c4d9b-tcdqd\" (UID: \"d0c7e574-0448-4c1c-898d-694736cde753\") " pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.207351 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0c7e574-0448-4c1c-898d-694736cde753-serving-cert\") pod \"route-controller-manager-6db58c4d9b-tcdqd\" (UID: \"d0c7e574-0448-4c1c-898d-694736cde753\") " pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.207382 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0c7e574-0448-4c1c-898d-694736cde753-config\") pod \"route-controller-manager-6db58c4d9b-tcdqd\" (UID: \"d0c7e574-0448-4c1c-898d-694736cde753\") " pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.211236 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-cd4rs"] Mar 19 10:29:11 crc kubenswrapper[4765]: W0319 10:29:11.228413 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00429786_088e_4d39_be6e_050615aeba42.slice/crio-0e7f58eca4435b256b32efcc0fc17c50362d6af9447f6bcfa63ff5cc83a62a2d WatchSource:0}: Error finding container 0e7f58eca4435b256b32efcc0fc17c50362d6af9447f6bcfa63ff5cc83a62a2d: Status 404 returned error can't find the container with id 0e7f58eca4435b256b32efcc0fc17c50362d6af9447f6bcfa63ff5cc83a62a2d Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.273703 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.273702 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cd4rs" event={"ID":"00429786-088e-4d39-be6e-050615aeba42","Type":"ContainerStarted","Data":"0e7f58eca4435b256b32efcc0fc17c50362d6af9447f6bcfa63ff5cc83a62a2d"} Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.308745 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0c7e574-0448-4c1c-898d-694736cde753-serving-cert\") pod \"route-controller-manager-6db58c4d9b-tcdqd\" (UID: \"d0c7e574-0448-4c1c-898d-694736cde753\") " pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.308915 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0c7e574-0448-4c1c-898d-694736cde753-config\") pod \"route-controller-manager-6db58c4d9b-tcdqd\" (UID: \"d0c7e574-0448-4c1c-898d-694736cde753\") " pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.308998 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0c7e574-0448-4c1c-898d-694736cde753-client-ca\") pod \"route-controller-manager-6db58c4d9b-tcdqd\" (UID: \"d0c7e574-0448-4c1c-898d-694736cde753\") " pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.309071 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jb5l\" (UniqueName: \"kubernetes.io/projected/d0c7e574-0448-4c1c-898d-694736cde753-kube-api-access-9jb5l\") pod 
\"route-controller-manager-6db58c4d9b-tcdqd\" (UID: \"d0c7e574-0448-4c1c-898d-694736cde753\") " pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.309908 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl"] Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.310528 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0c7e574-0448-4c1c-898d-694736cde753-client-ca\") pod \"route-controller-manager-6db58c4d9b-tcdqd\" (UID: \"d0c7e574-0448-4c1c-898d-694736cde753\") " pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.311229 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0c7e574-0448-4c1c-898d-694736cde753-config\") pod \"route-controller-manager-6db58c4d9b-tcdqd\" (UID: \"d0c7e574-0448-4c1c-898d-694736cde753\") " pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.313723 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0c7e574-0448-4c1c-898d-694736cde753-serving-cert\") pod \"route-controller-manager-6db58c4d9b-tcdqd\" (UID: \"d0c7e574-0448-4c1c-898d-694736cde753\") " pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.314746 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6f9c758cfc-d5rzl"] Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.338902 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-9jb5l\" (UniqueName: \"kubernetes.io/projected/d0c7e574-0448-4c1c-898d-694736cde753-kube-api-access-9jb5l\") pod \"route-controller-manager-6db58c4d9b-tcdqd\" (UID: \"d0c7e574-0448-4c1c-898d-694736cde753\") " pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.374478 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-kqvrg"] Mar 19 10:29:11 crc kubenswrapper[4765]: W0319 10:29:11.382392 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod306b7883_2d24_4ff3_9154_8a45be8447fb.slice/crio-0b3ca35060e2d5959765e1a6d0a52873c31dc1f10de60f849dd1e565e492586b WatchSource:0}: Error finding container 0b3ca35060e2d5959765e1a6d0a52873c31dc1f10de60f849dd1e565e492586b: Status 404 returned error can't find the container with id 0b3ca35060e2d5959765e1a6d0a52873c31dc1f10de60f849dd1e565e492586b Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.448365 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:11 crc kubenswrapper[4765]: I0319 10:29:11.865026 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd"] Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.292785 4765 generic.go:334] "Generic (PLEG): container finished" podID="306b7883-2d24-4ff3-9154-8a45be8447fb" containerID="f896e306dc469f84129a45d98162abd1a35a12ce71d7b055a33b0ad413a4fbd2" exitCode=0 Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.292929 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqvrg" event={"ID":"306b7883-2d24-4ff3-9154-8a45be8447fb","Type":"ContainerDied","Data":"f896e306dc469f84129a45d98162abd1a35a12ce71d7b055a33b0ad413a4fbd2"} Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.293025 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqvrg" event={"ID":"306b7883-2d24-4ff3-9154-8a45be8447fb","Type":"ContainerStarted","Data":"0b3ca35060e2d5959765e1a6d0a52873c31dc1f10de60f849dd1e565e492586b"} Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.295907 4765 generic.go:334] "Generic (PLEG): container finished" podID="00429786-088e-4d39-be6e-050615aeba42" containerID="3c367867de4ead56d5c6d179f902339622bb43558caa4ab87c6a087810ad13b7" exitCode=0 Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.296018 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cd4rs" event={"ID":"00429786-088e-4d39-be6e-050615aeba42","Type":"ContainerDied","Data":"3c367867de4ead56d5c6d179f902339622bb43558caa4ab87c6a087810ad13b7"} Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.298482 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" 
event={"ID":"d0c7e574-0448-4c1c-898d-694736cde753","Type":"ContainerStarted","Data":"27790aa3a508be0ef3f50b12c61fe36af0feafee569be1ced2799e6ecc2a6674"} Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.298543 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" event={"ID":"d0c7e574-0448-4c1c-898d-694736cde753","Type":"ContainerStarted","Data":"bf25d624d92e96b0e8b7fd948b60c6e2fc20eca7edb582bfc191bda008ddded0"} Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.298797 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.338511 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" podStartSLOduration=3.338458466 podStartE2EDuration="3.338458466s" podCreationTimestamp="2026-03-19 10:29:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:29:12.335908577 +0000 UTC m=+450.684854139" watchObservedRunningTime="2026-03-19 10:29:12.338458466 +0000 UTC m=+450.687404018" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.366696 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab25744c-43c6-4303-9dd4-048b634054b6" path="/var/lib/kubelet/pods/ab25744c-43c6-4303-9dd4-048b634054b6/volumes" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.522407 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6db58c4d9b-tcdqd" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.609600 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dcfzm"] Mar 19 10:29:12 crc 
kubenswrapper[4765]: I0319 10:29:12.610709 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.635987 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dcfzm"] Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.730627 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.730737 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-registry-certificates\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.730767 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.730806 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnqsq\" (UniqueName: \"kubernetes.io/projected/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-kube-api-access-wnqsq\") pod 
\"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.731063 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-trusted-ca\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.731126 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-bound-sa-token\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.731172 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.731212 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-registry-tls\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.765309 4765 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.806864 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9chfw"] Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.808106 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9chfw" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.810552 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.819438 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9chfw"] Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.832394 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-registry-certificates\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.832447 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.832504 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-wnqsq\" (UniqueName: \"kubernetes.io/projected/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-kube-api-access-wnqsq\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.832546 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-trusted-ca\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.832565 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-bound-sa-token\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.832586 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-registry-tls\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.832636 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc 
kubenswrapper[4765]: I0319 10:29:12.833290 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-ca-trust-extracted\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.835223 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-registry-certificates\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.835369 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-trusted-ca\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.840454 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-installation-pull-secrets\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.840864 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-registry-tls\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.852071 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-bound-sa-token\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.852496 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnqsq\" (UniqueName: \"kubernetes.io/projected/f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e-kube-api-access-wnqsq\") pod \"image-registry-66df7c8f76-dcfzm\" (UID: \"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e\") " pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.934119 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zt5pk\" (UniqueName: \"kubernetes.io/projected/df325686-add0-407b-afdf-f9093391d64c-kube-api-access-zt5pk\") pod \"redhat-marketplace-9chfw\" (UID: \"df325686-add0-407b-afdf-f9093391d64c\") " pod="openshift-marketplace/redhat-marketplace-9chfw" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.934213 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df325686-add0-407b-afdf-f9093391d64c-catalog-content\") pod \"redhat-marketplace-9chfw\" (UID: \"df325686-add0-407b-afdf-f9093391d64c\") " pod="openshift-marketplace/redhat-marketplace-9chfw" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.934288 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df325686-add0-407b-afdf-f9093391d64c-utilities\") pod 
\"redhat-marketplace-9chfw\" (UID: \"df325686-add0-407b-afdf-f9093391d64c\") " pod="openshift-marketplace/redhat-marketplace-9chfw" Mar 19 10:29:12 crc kubenswrapper[4765]: I0319 10:29:12.934547 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.006624 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-d2rb7"] Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.008320 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.011091 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.019010 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d2rb7"] Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.035388 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/df325686-add0-407b-afdf-f9093391d64c-utilities\") pod \"redhat-marketplace-9chfw\" (UID: \"df325686-add0-407b-afdf-f9093391d64c\") " pod="openshift-marketplace/redhat-marketplace-9chfw" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.035509 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zt5pk\" (UniqueName: \"kubernetes.io/projected/df325686-add0-407b-afdf-f9093391d64c-kube-api-access-zt5pk\") pod \"redhat-marketplace-9chfw\" (UID: \"df325686-add0-407b-afdf-f9093391d64c\") " pod="openshift-marketplace/redhat-marketplace-9chfw" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.035803 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" 
(UniqueName: \"kubernetes.io/empty-dir/df325686-add0-407b-afdf-f9093391d64c-utilities\") pod \"redhat-marketplace-9chfw\" (UID: \"df325686-add0-407b-afdf-f9093391d64c\") " pod="openshift-marketplace/redhat-marketplace-9chfw" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.035913 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df325686-add0-407b-afdf-f9093391d64c-catalog-content\") pod \"redhat-marketplace-9chfw\" (UID: \"df325686-add0-407b-afdf-f9093391d64c\") " pod="openshift-marketplace/redhat-marketplace-9chfw" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.036186 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/df325686-add0-407b-afdf-f9093391d64c-catalog-content\") pod \"redhat-marketplace-9chfw\" (UID: \"df325686-add0-407b-afdf-f9093391d64c\") " pod="openshift-marketplace/redhat-marketplace-9chfw" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.055049 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zt5pk\" (UniqueName: \"kubernetes.io/projected/df325686-add0-407b-afdf-f9093391d64c-kube-api-access-zt5pk\") pod \"redhat-marketplace-9chfw\" (UID: \"df325686-add0-407b-afdf-f9093391d64c\") " pod="openshift-marketplace/redhat-marketplace-9chfw" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.131152 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9chfw" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.138032 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb857f38-edd5-4cd5-9004-3f1737f6aec8-catalog-content\") pod \"redhat-operators-d2rb7\" (UID: \"bb857f38-edd5-4cd5-9004-3f1737f6aec8\") " pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.138129 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gl84n\" (UniqueName: \"kubernetes.io/projected/bb857f38-edd5-4cd5-9004-3f1737f6aec8-kube-api-access-gl84n\") pod \"redhat-operators-d2rb7\" (UID: \"bb857f38-edd5-4cd5-9004-3f1737f6aec8\") " pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.138240 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb857f38-edd5-4cd5-9004-3f1737f6aec8-utilities\") pod \"redhat-operators-d2rb7\" (UID: \"bb857f38-edd5-4cd5-9004-3f1737f6aec8\") " pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.240322 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb857f38-edd5-4cd5-9004-3f1737f6aec8-catalog-content\") pod \"redhat-operators-d2rb7\" (UID: \"bb857f38-edd5-4cd5-9004-3f1737f6aec8\") " pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.240395 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gl84n\" (UniqueName: \"kubernetes.io/projected/bb857f38-edd5-4cd5-9004-3f1737f6aec8-kube-api-access-gl84n\") pod 
\"redhat-operators-d2rb7\" (UID: \"bb857f38-edd5-4cd5-9004-3f1737f6aec8\") " pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.240477 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb857f38-edd5-4cd5-9004-3f1737f6aec8-utilities\") pod \"redhat-operators-d2rb7\" (UID: \"bb857f38-edd5-4cd5-9004-3f1737f6aec8\") " pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.240889 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb857f38-edd5-4cd5-9004-3f1737f6aec8-catalog-content\") pod \"redhat-operators-d2rb7\" (UID: \"bb857f38-edd5-4cd5-9004-3f1737f6aec8\") " pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.240952 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb857f38-edd5-4cd5-9004-3f1737f6aec8-utilities\") pod \"redhat-operators-d2rb7\" (UID: \"bb857f38-edd5-4cd5-9004-3f1737f6aec8\") " pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.258915 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gl84n\" (UniqueName: \"kubernetes.io/projected/bb857f38-edd5-4cd5-9004-3f1737f6aec8-kube-api-access-gl84n\") pod \"redhat-operators-d2rb7\" (UID: \"bb857f38-edd5-4cd5-9004-3f1737f6aec8\") " pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.334004 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.687785 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-dcfzm"] Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.820112 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9chfw"] Mar 19 10:29:13 crc kubenswrapper[4765]: I0319 10:29:13.939119 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-d2rb7"] Mar 19 10:29:13 crc kubenswrapper[4765]: W0319 10:29:13.980827 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbb857f38_edd5_4cd5_9004_3f1737f6aec8.slice/crio-43689b0ad00a34ff1e647d1aaf15c888c472e853dc4275c834102ae1504acc1d WatchSource:0}: Error finding container 43689b0ad00a34ff1e647d1aaf15c888c472e853dc4275c834102ae1504acc1d: Status 404 returned error can't find the container with id 43689b0ad00a34ff1e647d1aaf15c888c472e853dc4275c834102ae1504acc1d Mar 19 10:29:14 crc kubenswrapper[4765]: I0319 10:29:14.313206 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2rb7" event={"ID":"bb857f38-edd5-4cd5-9004-3f1737f6aec8","Type":"ContainerStarted","Data":"b8bb7ef1d888e2b9b8b4b67924c001f657c9881109974e97093e5447c325fc44"} Mar 19 10:29:14 crc kubenswrapper[4765]: I0319 10:29:14.313617 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2rb7" event={"ID":"bb857f38-edd5-4cd5-9004-3f1737f6aec8","Type":"ContainerStarted","Data":"43689b0ad00a34ff1e647d1aaf15c888c472e853dc4275c834102ae1504acc1d"} Mar 19 10:29:14 crc kubenswrapper[4765]: I0319 10:29:14.315261 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" 
event={"ID":"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e","Type":"ContainerStarted","Data":"798bd7dcff016a3b4ca2656048031342560a425f2375e079065dc25358a68acd"} Mar 19 10:29:14 crc kubenswrapper[4765]: I0319 10:29:14.315323 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" event={"ID":"f2a2f96a-c7a4-499f-a1fd-02e5cec23f6e","Type":"ContainerStarted","Data":"886050f61491b3a27dd00e8c53ec2f2dc5844fbca28a78c4d491ab525d79ce41"} Mar 19 10:29:14 crc kubenswrapper[4765]: I0319 10:29:14.315403 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:14 crc kubenswrapper[4765]: I0319 10:29:14.318094 4765 generic.go:334] "Generic (PLEG): container finished" podID="306b7883-2d24-4ff3-9154-8a45be8447fb" containerID="ace9d7f04242335e34d1456dd684b3c2f116a903cb46ba93307161b055e4fba4" exitCode=0 Mar 19 10:29:14 crc kubenswrapper[4765]: I0319 10:29:14.318274 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqvrg" event={"ID":"306b7883-2d24-4ff3-9154-8a45be8447fb","Type":"ContainerDied","Data":"ace9d7f04242335e34d1456dd684b3c2f116a903cb46ba93307161b055e4fba4"} Mar 19 10:29:14 crc kubenswrapper[4765]: I0319 10:29:14.320335 4765 generic.go:334] "Generic (PLEG): container finished" podID="df325686-add0-407b-afdf-f9093391d64c" containerID="63ba49088ed587cafabba494d7cb2df91a17b82870296f93d84258431194a04b" exitCode=0 Mar 19 10:29:14 crc kubenswrapper[4765]: I0319 10:29:14.320416 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9chfw" event={"ID":"df325686-add0-407b-afdf-f9093391d64c","Type":"ContainerDied","Data":"63ba49088ed587cafabba494d7cb2df91a17b82870296f93d84258431194a04b"} Mar 19 10:29:14 crc kubenswrapper[4765]: I0319 10:29:14.320460 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-marketplace-9chfw" event={"ID":"df325686-add0-407b-afdf-f9093391d64c","Type":"ContainerStarted","Data":"59a3283d4937cf97d39336e56fdb07e1409eee13275dcc36827623765d5d052f"} Mar 19 10:29:14 crc kubenswrapper[4765]: I0319 10:29:14.323538 4765 generic.go:334] "Generic (PLEG): container finished" podID="00429786-088e-4d39-be6e-050615aeba42" containerID="c728dc0f997eac2f7a19da11e4fb42d6a5fc6029f3ffd6855041eb91522ff270" exitCode=0 Mar 19 10:29:14 crc kubenswrapper[4765]: I0319 10:29:14.323603 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cd4rs" event={"ID":"00429786-088e-4d39-be6e-050615aeba42","Type":"ContainerDied","Data":"c728dc0f997eac2f7a19da11e4fb42d6a5fc6029f3ffd6855041eb91522ff270"} Mar 19 10:29:14 crc kubenswrapper[4765]: I0319 10:29:14.344681 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" podStartSLOduration=2.344652402 podStartE2EDuration="2.344652402s" podCreationTimestamp="2026-03-19 10:29:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:29:14.343302475 +0000 UTC m=+452.692248017" watchObservedRunningTime="2026-03-19 10:29:14.344652402 +0000 UTC m=+452.693597954" Mar 19 10:29:15 crc kubenswrapper[4765]: I0319 10:29:15.330652 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-cd4rs" event={"ID":"00429786-088e-4d39-be6e-050615aeba42","Type":"ContainerStarted","Data":"fc39879468b07433b0d74d99c412d1f42bec36175eb427cba5909f166aba4fa5"} Mar 19 10:29:15 crc kubenswrapper[4765]: I0319 10:29:15.336079 4765 generic.go:334] "Generic (PLEG): container finished" podID="bb857f38-edd5-4cd5-9004-3f1737f6aec8" containerID="b8bb7ef1d888e2b9b8b4b67924c001f657c9881109974e97093e5447c325fc44" exitCode=0 Mar 19 10:29:15 crc kubenswrapper[4765]: I0319 
10:29:15.336158 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2rb7" event={"ID":"bb857f38-edd5-4cd5-9004-3f1737f6aec8","Type":"ContainerDied","Data":"b8bb7ef1d888e2b9b8b4b67924c001f657c9881109974e97093e5447c325fc44"} Mar 19 10:29:15 crc kubenswrapper[4765]: I0319 10:29:15.340840 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqvrg" event={"ID":"306b7883-2d24-4ff3-9154-8a45be8447fb","Type":"ContainerStarted","Data":"640bdf4c69d151e9731e1dcbbf410a6e8dd702920781661edafa1469372a76ba"} Mar 19 10:29:15 crc kubenswrapper[4765]: I0319 10:29:15.379025 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-cd4rs" podStartSLOduration=2.607034271 podStartE2EDuration="5.378997299s" podCreationTimestamp="2026-03-19 10:29:10 +0000 UTC" firstStartedPulling="2026-03-19 10:29:12.297059531 +0000 UTC m=+450.646005073" lastFinishedPulling="2026-03-19 10:29:15.069022559 +0000 UTC m=+453.417968101" observedRunningTime="2026-03-19 10:29:15.359225452 +0000 UTC m=+453.708171004" watchObservedRunningTime="2026-03-19 10:29:15.378997299 +0000 UTC m=+453.727942861" Mar 19 10:29:15 crc kubenswrapper[4765]: I0319 10:29:15.410346 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-kqvrg" podStartSLOduration=2.786575199 podStartE2EDuration="5.41031955s" podCreationTimestamp="2026-03-19 10:29:10 +0000 UTC" firstStartedPulling="2026-03-19 10:29:12.295295483 +0000 UTC m=+450.644241025" lastFinishedPulling="2026-03-19 10:29:14.919039834 +0000 UTC m=+453.267985376" observedRunningTime="2026-03-19 10:29:15.404919723 +0000 UTC m=+453.753865265" watchObservedRunningTime="2026-03-19 10:29:15.41031955 +0000 UTC m=+453.759265092" Mar 19 10:29:16 crc kubenswrapper[4765]: I0319 10:29:16.356709 4765 generic.go:334] "Generic (PLEG): container finished" 
podID="df325686-add0-407b-afdf-f9093391d64c" containerID="4273d3472f595bd638af5ae175afd2f11e771c54b38f33d34fb64ef2cf2c6f71" exitCode=0 Mar 19 10:29:16 crc kubenswrapper[4765]: I0319 10:29:16.365788 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9chfw" event={"ID":"df325686-add0-407b-afdf-f9093391d64c","Type":"ContainerDied","Data":"4273d3472f595bd638af5ae175afd2f11e771c54b38f33d34fb64ef2cf2c6f71"} Mar 19 10:29:17 crc kubenswrapper[4765]: I0319 10:29:17.366581 4765 generic.go:334] "Generic (PLEG): container finished" podID="bb857f38-edd5-4cd5-9004-3f1737f6aec8" containerID="9238cbf791b7ea6b206433db3c57d0ba41a243c4eb691244cdc45b2d317ecfc3" exitCode=0 Mar 19 10:29:17 crc kubenswrapper[4765]: I0319 10:29:17.366692 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2rb7" event={"ID":"bb857f38-edd5-4cd5-9004-3f1737f6aec8","Type":"ContainerDied","Data":"9238cbf791b7ea6b206433db3c57d0ba41a243c4eb691244cdc45b2d317ecfc3"} Mar 19 10:29:17 crc kubenswrapper[4765]: I0319 10:29:17.372087 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9chfw" event={"ID":"df325686-add0-407b-afdf-f9093391d64c","Type":"ContainerStarted","Data":"4d2cd0af013490cec8583d8444df4fe170016011b043f6d978915838bda819e0"} Mar 19 10:29:18 crc kubenswrapper[4765]: I0319 10:29:18.381468 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2rb7" event={"ID":"bb857f38-edd5-4cd5-9004-3f1737f6aec8","Type":"ContainerStarted","Data":"8991e990eaaed4087cd59e908a50204b83e97382c62e738a03cfff181042688a"} Mar 19 10:29:18 crc kubenswrapper[4765]: I0319 10:29:18.409848 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-d2rb7" podStartSLOduration=3.754500552 podStartE2EDuration="6.40981411s" podCreationTimestamp="2026-03-19 10:29:12 +0000 UTC" 
firstStartedPulling="2026-03-19 10:29:15.338268553 +0000 UTC m=+453.687214095" lastFinishedPulling="2026-03-19 10:29:17.993582111 +0000 UTC m=+456.342527653" observedRunningTime="2026-03-19 10:29:18.40025295 +0000 UTC m=+456.749198502" watchObservedRunningTime="2026-03-19 10:29:18.40981411 +0000 UTC m=+456.758759672" Mar 19 10:29:20 crc kubenswrapper[4765]: I0319 10:29:20.756515 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-cd4rs" Mar 19 10:29:20 crc kubenswrapper[4765]: I0319 10:29:20.757248 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-cd4rs" Mar 19 10:29:20 crc kubenswrapper[4765]: I0319 10:29:20.804527 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-cd4rs" Mar 19 10:29:20 crc kubenswrapper[4765]: I0319 10:29:20.829538 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9chfw" podStartSLOduration=6.082655519 podStartE2EDuration="8.829512734s" podCreationTimestamp="2026-03-19 10:29:12 +0000 UTC" firstStartedPulling="2026-03-19 10:29:14.321913014 +0000 UTC m=+452.670858566" lastFinishedPulling="2026-03-19 10:29:17.068770239 +0000 UTC m=+455.417715781" observedRunningTime="2026-03-19 10:29:18.436468894 +0000 UTC m=+456.785414466" watchObservedRunningTime="2026-03-19 10:29:20.829512734 +0000 UTC m=+459.178458276" Mar 19 10:29:20 crc kubenswrapper[4765]: I0319 10:29:20.962747 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-kqvrg" Mar 19 10:29:20 crc kubenswrapper[4765]: I0319 10:29:20.962825 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-kqvrg" Mar 19 10:29:21 crc kubenswrapper[4765]: I0319 10:29:21.003596 4765 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-kqvrg" Mar 19 10:29:21 crc kubenswrapper[4765]: I0319 10:29:21.450948 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-cd4rs" Mar 19 10:29:21 crc kubenswrapper[4765]: I0319 10:29:21.452491 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-kqvrg" Mar 19 10:29:23 crc kubenswrapper[4765]: I0319 10:29:23.132016 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9chfw" Mar 19 10:29:23 crc kubenswrapper[4765]: I0319 10:29:23.132498 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9chfw" Mar 19 10:29:23 crc kubenswrapper[4765]: I0319 10:29:23.190094 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-9chfw" Mar 19 10:29:23 crc kubenswrapper[4765]: I0319 10:29:23.334608 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:29:23 crc kubenswrapper[4765]: I0319 10:29:23.334690 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:29:23 crc kubenswrapper[4765]: I0319 10:29:23.453365 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9chfw" Mar 19 10:29:24 crc kubenswrapper[4765]: I0319 10:29:24.374756 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-d2rb7" podUID="bb857f38-edd5-4cd5-9004-3f1737f6aec8" containerName="registry-server" probeResult="failure" output=< Mar 19 10:29:24 crc kubenswrapper[4765]: timeout: failed to connect service ":50051" within 1s Mar 19 10:29:24 
crc kubenswrapper[4765]: > Mar 19 10:29:28 crc kubenswrapper[4765]: I0319 10:29:28.390715 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:29:28 crc kubenswrapper[4765]: I0319 10:29:28.407328 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:29:28 crc kubenswrapper[4765]: I0319 10:29:28.492916 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:29:28 crc kubenswrapper[4765]: I0319 10:29:28.500045 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:29:28 crc kubenswrapper[4765]: I0319 10:29:28.556800 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 19 10:29:29 crc kubenswrapper[4765]: W0319 10:29:29.168366 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fe485a1_e14f_4c09_b5b9_f252bc42b7e8.slice/crio-7c5c6239b9fd290f1e5f747d15e86057a0e655e0584c09b268de18772dbe7ca7 WatchSource:0}: Error finding container 7c5c6239b9fd290f1e5f747d15e86057a0e655e0584c09b268de18772dbe7ca7: Status 404 returned error can't find the container with id 7c5c6239b9fd290f1e5f747d15e86057a0e655e0584c09b268de18772dbe7ca7 Mar 19 10:29:29 crc kubenswrapper[4765]: I0319 10:29:29.443568 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"7c5c6239b9fd290f1e5f747d15e86057a0e655e0584c09b268de18772dbe7ca7"} Mar 19 10:29:29 crc kubenswrapper[4765]: I0319 10:29:29.508710 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:29:29 crc kubenswrapper[4765]: I0319 10:29:29.508805 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:29:29 crc kubenswrapper[4765]: I0319 10:29:29.514465 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:29:29 crc kubenswrapper[4765]: I0319 10:29:29.514487 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:29:29 crc kubenswrapper[4765]: I0319 10:29:29.556379 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 19 10:29:29 crc kubenswrapper[4765]: I0319 10:29:29.756173 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:29:29 crc kubenswrapper[4765]: I0319 10:29:29.799714 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c54555755-wx24k"] Mar 19 10:29:29 crc kubenswrapper[4765]: I0319 10:29:29.800013 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" podUID="5d2ee826-6d61-492b-a791-443185dc78e6" containerName="controller-manager" containerID="cri-o://bb0fa71879c615553566b2c375fe2265a718361cb0b1da8626ced91bacaf645f" gracePeriod=30 Mar 19 10:29:30 crc kubenswrapper[4765]: W0319 10:29:30.061645 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-4085410db171f8b0f912b09b542c59e33cb828e3ce05a5f114f9db0b270dd1be WatchSource:0}: Error finding container 4085410db171f8b0f912b09b542c59e33cb828e3ce05a5f114f9db0b270dd1be: Status 404 returned error can't find the container with id 4085410db171f8b0f912b09b542c59e33cb828e3ce05a5f114f9db0b270dd1be Mar 19 10:29:30 crc kubenswrapper[4765]: I0319 10:29:30.456297 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"cdd954f6f507831277a3a49b7e57de0549be8b18db05834aaa32fa4b4a85fbcd"} Mar 19 10:29:30 crc kubenswrapper[4765]: I0319 10:29:30.458109 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4085410db171f8b0f912b09b542c59e33cb828e3ce05a5f114f9db0b270dd1be"} Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.394470 4765 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.430109 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5cd4496958-j8p97"] Mar 19 10:29:31 crc kubenswrapper[4765]: E0319 10:29:31.430405 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d2ee826-6d61-492b-a791-443185dc78e6" containerName="controller-manager" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.430421 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d2ee826-6d61-492b-a791-443185dc78e6" containerName="controller-manager" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.430532 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d2ee826-6d61-492b-a791-443185dc78e6" containerName="controller-manager" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.430919 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.447668 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cd4496958-j8p97"] Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.465021 4765 generic.go:334] "Generic (PLEG): container finished" podID="5d2ee826-6d61-492b-a791-443185dc78e6" containerID="bb0fa71879c615553566b2c375fe2265a718361cb0b1da8626ced91bacaf645f" exitCode=0 Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.465096 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" event={"ID":"5d2ee826-6d61-492b-a791-443185dc78e6","Type":"ContainerDied","Data":"bb0fa71879c615553566b2c375fe2265a718361cb0b1da8626ced91bacaf645f"} Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.465125 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" event={"ID":"5d2ee826-6d61-492b-a791-443185dc78e6","Type":"ContainerDied","Data":"26bf85529175a64f207e77dbaf5f5c3b6d6470ac0eb00fb176c01fe33f29855b"} Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.465144 4765 scope.go:117] "RemoveContainer" containerID="bb0fa71879c615553566b2c375fe2265a718361cb0b1da8626ced91bacaf645f" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.465256 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6c54555755-wx24k" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.470035 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"4aee60e2bd8620c2ebd9276be5afd1dd15750dc7b7a1c0e8c4656ee2027dfdda"} Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.473419 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"e9208df6c6ced4a363a81042d826871ce10eddaa418aef50919979c2b0246fee"} Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.480620 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"4d816ea7904e6bb2e918d6408e530e11859af7075517928490a41d0b977d9ab7"} Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.481025 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.501501 4765 scope.go:117] "RemoveContainer" containerID="bb0fa71879c615553566b2c375fe2265a718361cb0b1da8626ced91bacaf645f" Mar 19 10:29:31 crc kubenswrapper[4765]: E0319 10:29:31.502146 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb0fa71879c615553566b2c375fe2265a718361cb0b1da8626ced91bacaf645f\": container with ID starting with bb0fa71879c615553566b2c375fe2265a718361cb0b1da8626ced91bacaf645f not found: ID does not exist" containerID="bb0fa71879c615553566b2c375fe2265a718361cb0b1da8626ced91bacaf645f" Mar 19 10:29:31 crc kubenswrapper[4765]: 
I0319 10:29:31.502194 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb0fa71879c615553566b2c375fe2265a718361cb0b1da8626ced91bacaf645f"} err="failed to get container status \"bb0fa71879c615553566b2c375fe2265a718361cb0b1da8626ced91bacaf645f\": rpc error: code = NotFound desc = could not find container \"bb0fa71879c615553566b2c375fe2265a718361cb0b1da8626ced91bacaf645f\": container with ID starting with bb0fa71879c615553566b2c375fe2265a718361cb0b1da8626ced91bacaf645f not found: ID does not exist" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.542364 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d2ee826-6d61-492b-a791-443185dc78e6-serving-cert\") pod \"5d2ee826-6d61-492b-a791-443185dc78e6\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.542897 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-proxy-ca-bundles\") pod \"5d2ee826-6d61-492b-a791-443185dc78e6\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.542953 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-client-ca\") pod \"5d2ee826-6d61-492b-a791-443185dc78e6\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.543104 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-config\") pod \"5d2ee826-6d61-492b-a791-443185dc78e6\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 
10:29:31.543175 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qr4wr\" (UniqueName: \"kubernetes.io/projected/5d2ee826-6d61-492b-a791-443185dc78e6-kube-api-access-qr4wr\") pod \"5d2ee826-6d61-492b-a791-443185dc78e6\" (UID: \"5d2ee826-6d61-492b-a791-443185dc78e6\") " Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.543432 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/057322e9-c040-4b79-abff-212922df42ee-proxy-ca-bundles\") pod \"controller-manager-5cd4496958-j8p97\" (UID: \"057322e9-c040-4b79-abff-212922df42ee\") " pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.543467 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz5c2\" (UniqueName: \"kubernetes.io/projected/057322e9-c040-4b79-abff-212922df42ee-kube-api-access-kz5c2\") pod \"controller-manager-5cd4496958-j8p97\" (UID: \"057322e9-c040-4b79-abff-212922df42ee\") " pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.543495 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/057322e9-c040-4b79-abff-212922df42ee-serving-cert\") pod \"controller-manager-5cd4496958-j8p97\" (UID: \"057322e9-c040-4b79-abff-212922df42ee\") " pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.544982 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/057322e9-c040-4b79-abff-212922df42ee-client-ca\") pod \"controller-manager-5cd4496958-j8p97\" (UID: 
\"057322e9-c040-4b79-abff-212922df42ee\") " pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.545029 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/057322e9-c040-4b79-abff-212922df42ee-config\") pod \"controller-manager-5cd4496958-j8p97\" (UID: \"057322e9-c040-4b79-abff-212922df42ee\") " pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.550489 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-client-ca" (OuterVolumeSpecName: "client-ca") pod "5d2ee826-6d61-492b-a791-443185dc78e6" (UID: "5d2ee826-6d61-492b-a791-443185dc78e6"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.550522 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "5d2ee826-6d61-492b-a791-443185dc78e6" (UID: "5d2ee826-6d61-492b-a791-443185dc78e6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.550545 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-config" (OuterVolumeSpecName: "config") pod "5d2ee826-6d61-492b-a791-443185dc78e6" (UID: "5d2ee826-6d61-492b-a791-443185dc78e6"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.554085 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5d2ee826-6d61-492b-a791-443185dc78e6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5d2ee826-6d61-492b-a791-443185dc78e6" (UID: "5d2ee826-6d61-492b-a791-443185dc78e6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.554280 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d2ee826-6d61-492b-a791-443185dc78e6-kube-api-access-qr4wr" (OuterVolumeSpecName: "kube-api-access-qr4wr") pod "5d2ee826-6d61-492b-a791-443185dc78e6" (UID: "5d2ee826-6d61-492b-a791-443185dc78e6"). InnerVolumeSpecName "kube-api-access-qr4wr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.646300 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/057322e9-c040-4b79-abff-212922df42ee-config\") pod \"controller-manager-5cd4496958-j8p97\" (UID: \"057322e9-c040-4b79-abff-212922df42ee\") " pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.646377 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/057322e9-c040-4b79-abff-212922df42ee-proxy-ca-bundles\") pod \"controller-manager-5cd4496958-j8p97\" (UID: \"057322e9-c040-4b79-abff-212922df42ee\") " pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.646401 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz5c2\" (UniqueName: 
\"kubernetes.io/projected/057322e9-c040-4b79-abff-212922df42ee-kube-api-access-kz5c2\") pod \"controller-manager-5cd4496958-j8p97\" (UID: \"057322e9-c040-4b79-abff-212922df42ee\") " pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.646432 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/057322e9-c040-4b79-abff-212922df42ee-serving-cert\") pod \"controller-manager-5cd4496958-j8p97\" (UID: \"057322e9-c040-4b79-abff-212922df42ee\") " pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.646512 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/057322e9-c040-4b79-abff-212922df42ee-client-ca\") pod \"controller-manager-5cd4496958-j8p97\" (UID: \"057322e9-c040-4b79-abff-212922df42ee\") " pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.646569 4765 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.646581 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.646590 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qr4wr\" (UniqueName: \"kubernetes.io/projected/5d2ee826-6d61-492b-a791-443185dc78e6-kube-api-access-qr4wr\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.646600 4765 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/5d2ee826-6d61-492b-a791-443185dc78e6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.646608 4765 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/5d2ee826-6d61-492b-a791-443185dc78e6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.647729 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/057322e9-c040-4b79-abff-212922df42ee-client-ca\") pod \"controller-manager-5cd4496958-j8p97\" (UID: \"057322e9-c040-4b79-abff-212922df42ee\") " pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.647972 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/057322e9-c040-4b79-abff-212922df42ee-config\") pod \"controller-manager-5cd4496958-j8p97\" (UID: \"057322e9-c040-4b79-abff-212922df42ee\") " pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.648130 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/057322e9-c040-4b79-abff-212922df42ee-proxy-ca-bundles\") pod \"controller-manager-5cd4496958-j8p97\" (UID: \"057322e9-c040-4b79-abff-212922df42ee\") " pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.650580 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/057322e9-c040-4b79-abff-212922df42ee-serving-cert\") pod \"controller-manager-5cd4496958-j8p97\" (UID: \"057322e9-c040-4b79-abff-212922df42ee\") " 
pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.656518 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.656571 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.664312 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz5c2\" (UniqueName: \"kubernetes.io/projected/057322e9-c040-4b79-abff-212922df42ee-kube-api-access-kz5c2\") pod \"controller-manager-5cd4496958-j8p97\" (UID: \"057322e9-c040-4b79-abff-212922df42ee\") " pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.758798 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.834413 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6c54555755-wx24k"] Mar 19 10:29:31 crc kubenswrapper[4765]: I0319 10:29:31.838660 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6c54555755-wx24k"] Mar 19 10:29:32 crc kubenswrapper[4765]: I0319 10:29:32.181259 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5cd4496958-j8p97"] Mar 19 10:29:32 crc kubenswrapper[4765]: I0319 10:29:32.368603 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d2ee826-6d61-492b-a791-443185dc78e6" path="/var/lib/kubelet/pods/5d2ee826-6d61-492b-a791-443185dc78e6/volumes" Mar 19 10:29:32 crc kubenswrapper[4765]: I0319 10:29:32.488498 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" event={"ID":"057322e9-c040-4b79-abff-212922df42ee","Type":"ContainerStarted","Data":"ae3b755a6b3e5e05e132cf3d0f38fd8cfdc7ad370a7ab7f767d08951fc75aa08"} Mar 19 10:29:32 crc kubenswrapper[4765]: I0319 10:29:32.488590 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" event={"ID":"057322e9-c040-4b79-abff-212922df42ee","Type":"ContainerStarted","Data":"4575f86e3aa30e40d7ac0e3c1c94bb6cfe77643fbc1601667cfdb759e01e28b2"} Mar 19 10:29:32 crc kubenswrapper[4765]: I0319 10:29:32.488753 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:32 crc kubenswrapper[4765]: I0319 10:29:32.494655 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" Mar 19 10:29:32 crc kubenswrapper[4765]: I0319 10:29:32.510352 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5cd4496958-j8p97" podStartSLOduration=3.510332899 podStartE2EDuration="3.510332899s" podCreationTimestamp="2026-03-19 10:29:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:29:32.508539278 +0000 UTC m=+470.857484820" watchObservedRunningTime="2026-03-19 10:29:32.510332899 +0000 UTC m=+470.859278451" Mar 19 10:29:32 crc kubenswrapper[4765]: I0319 10:29:32.942379 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-dcfzm" Mar 19 10:29:32 crc kubenswrapper[4765]: I0319 10:29:32.993249 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x94pq"] Mar 19 10:29:33 crc kubenswrapper[4765]: I0319 10:29:33.377615 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:29:33 crc kubenswrapper[4765]: I0319 10:29:33.440685 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:29:58 crc kubenswrapper[4765]: I0319 10:29:58.034062 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" podUID="54a73adb-452b-4db3-9bc2-3411d1575eb5" containerName="registry" containerID="cri-o://375b68b916b3cbf45f22c93be49ba7ca4585c492f6ed54e350a958d1f07a4aac" gracePeriod=30 Mar 19 10:29:58 crc kubenswrapper[4765]: I0319 10:29:58.701132 4765 generic.go:334] "Generic (PLEG): container finished" podID="54a73adb-452b-4db3-9bc2-3411d1575eb5" 
containerID="375b68b916b3cbf45f22c93be49ba7ca4585c492f6ed54e350a958d1f07a4aac" exitCode=0 Mar 19 10:29:58 crc kubenswrapper[4765]: I0319 10:29:58.701186 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" event={"ID":"54a73adb-452b-4db3-9bc2-3411d1575eb5","Type":"ContainerDied","Data":"375b68b916b3cbf45f22c93be49ba7ca4585c492f6ed54e350a958d1f07a4aac"} Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.068058 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.156404 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-bound-sa-token\") pod \"54a73adb-452b-4db3-9bc2-3411d1575eb5\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.156466 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54a73adb-452b-4db3-9bc2-3411d1575eb5-registry-certificates\") pod \"54a73adb-452b-4db3-9bc2-3411d1575eb5\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.156518 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54a73adb-452b-4db3-9bc2-3411d1575eb5-trusted-ca\") pod \"54a73adb-452b-4db3-9bc2-3411d1575eb5\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.156570 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fdxhv\" (UniqueName: \"kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-kube-api-access-fdxhv\") pod 
\"54a73adb-452b-4db3-9bc2-3411d1575eb5\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.156970 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"54a73adb-452b-4db3-9bc2-3411d1575eb5\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.157019 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54a73adb-452b-4db3-9bc2-3411d1575eb5-installation-pull-secrets\") pod \"54a73adb-452b-4db3-9bc2-3411d1575eb5\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.157052 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54a73adb-452b-4db3-9bc2-3411d1575eb5-ca-trust-extracted\") pod \"54a73adb-452b-4db3-9bc2-3411d1575eb5\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.157101 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-registry-tls\") pod \"54a73adb-452b-4db3-9bc2-3411d1575eb5\" (UID: \"54a73adb-452b-4db3-9bc2-3411d1575eb5\") " Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.157900 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54a73adb-452b-4db3-9bc2-3411d1575eb5-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "54a73adb-452b-4db3-9bc2-3411d1575eb5" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.158460 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/54a73adb-452b-4db3-9bc2-3411d1575eb5-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "54a73adb-452b-4db3-9bc2-3411d1575eb5" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.163445 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a73adb-452b-4db3-9bc2-3411d1575eb5-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "54a73adb-452b-4db3-9bc2-3411d1575eb5" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.163542 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "54a73adb-452b-4db3-9bc2-3411d1575eb5" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.167589 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "54a73adb-452b-4db3-9bc2-3411d1575eb5" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.168414 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-kube-api-access-fdxhv" (OuterVolumeSpecName: "kube-api-access-fdxhv") pod "54a73adb-452b-4db3-9bc2-3411d1575eb5" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5"). InnerVolumeSpecName "kube-api-access-fdxhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.168606 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "54a73adb-452b-4db3-9bc2-3411d1575eb5" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.174714 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54a73adb-452b-4db3-9bc2-3411d1575eb5-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "54a73adb-452b-4db3-9bc2-3411d1575eb5" (UID: "54a73adb-452b-4db3-9bc2-3411d1575eb5"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.258764 4765 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.258813 4765 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/54a73adb-452b-4db3-9bc2-3411d1575eb5-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.258839 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/54a73adb-452b-4db3-9bc2-3411d1575eb5-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.258850 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fdxhv\" (UniqueName: \"kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-kube-api-access-fdxhv\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.258861 4765 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/54a73adb-452b-4db3-9bc2-3411d1575eb5-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.258871 4765 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/54a73adb-452b-4db3-9bc2-3411d1575eb5-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.258880 4765 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/54a73adb-452b-4db3-9bc2-3411d1575eb5-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 19 10:29:59 crc 
kubenswrapper[4765]: I0319 10:29:59.709610 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" event={"ID":"54a73adb-452b-4db3-9bc2-3411d1575eb5","Type":"ContainerDied","Data":"3a4e30cce45f9fbeafe7d5650bffe86e10ebac6bb3acb8c1227f488948fa0bcc"} Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.709674 4765 scope.go:117] "RemoveContainer" containerID="375b68b916b3cbf45f22c93be49ba7ca4585c492f6ed54e350a958d1f07a4aac" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.709798 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-x94pq" Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.737054 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x94pq"] Mar 19 10:29:59 crc kubenswrapper[4765]: I0319 10:29:59.741493 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-x94pq"] Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.142116 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796"] Mar 19 10:30:00 crc kubenswrapper[4765]: E0319 10:30:00.142377 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="54a73adb-452b-4db3-9bc2-3411d1575eb5" containerName="registry" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.142390 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a73adb-452b-4db3-9bc2-3411d1575eb5" containerName="registry" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.142520 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="54a73adb-452b-4db3-9bc2-3411d1575eb5" containerName="registry" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.142925 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.145225 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.145455 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.149044 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565270-xhklx"] Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.150698 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565270-xhklx" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.152475 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.154006 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.154285 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.156084 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796"] Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.162364 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565270-xhklx"] Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.270887 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rd6df\" (UniqueName: 
\"kubernetes.io/projected/665575bc-6ce3-4177-9229-2fe41e45fced-kube-api-access-rd6df\") pod \"auto-csr-approver-29565270-xhklx\" (UID: \"665575bc-6ce3-4177-9229-2fe41e45fced\") " pod="openshift-infra/auto-csr-approver-29565270-xhklx" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.270935 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v4vsc\" (UniqueName: \"kubernetes.io/projected/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-kube-api-access-v4vsc\") pod \"collect-profiles-29565270-9c796\" (UID: \"6b8d3c9f-2553-44cf-971d-27dec0e5f66e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.271070 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-secret-volume\") pod \"collect-profiles-29565270-9c796\" (UID: \"6b8d3c9f-2553-44cf-971d-27dec0e5f66e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.271130 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-config-volume\") pod \"collect-profiles-29565270-9c796\" (UID: \"6b8d3c9f-2553-44cf-971d-27dec0e5f66e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.362981 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="54a73adb-452b-4db3-9bc2-3411d1575eb5" path="/var/lib/kubelet/pods/54a73adb-452b-4db3-9bc2-3411d1575eb5/volumes" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.372604 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-config-volume\") pod \"collect-profiles-29565270-9c796\" (UID: \"6b8d3c9f-2553-44cf-971d-27dec0e5f66e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.372679 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rd6df\" (UniqueName: \"kubernetes.io/projected/665575bc-6ce3-4177-9229-2fe41e45fced-kube-api-access-rd6df\") pod \"auto-csr-approver-29565270-xhklx\" (UID: \"665575bc-6ce3-4177-9229-2fe41e45fced\") " pod="openshift-infra/auto-csr-approver-29565270-xhklx" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.372716 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v4vsc\" (UniqueName: \"kubernetes.io/projected/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-kube-api-access-v4vsc\") pod \"collect-profiles-29565270-9c796\" (UID: \"6b8d3c9f-2553-44cf-971d-27dec0e5f66e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.372791 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-secret-volume\") pod \"collect-profiles-29565270-9c796\" (UID: \"6b8d3c9f-2553-44cf-971d-27dec0e5f66e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.373882 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-config-volume\") pod \"collect-profiles-29565270-9c796\" (UID: \"6b8d3c9f-2553-44cf-971d-27dec0e5f66e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.378685 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-secret-volume\") pod \"collect-profiles-29565270-9c796\" (UID: \"6b8d3c9f-2553-44cf-971d-27dec0e5f66e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.390949 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v4vsc\" (UniqueName: \"kubernetes.io/projected/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-kube-api-access-v4vsc\") pod \"collect-profiles-29565270-9c796\" (UID: \"6b8d3c9f-2553-44cf-971d-27dec0e5f66e\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.391740 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rd6df\" (UniqueName: \"kubernetes.io/projected/665575bc-6ce3-4177-9229-2fe41e45fced-kube-api-access-rd6df\") pod \"auto-csr-approver-29565270-xhklx\" (UID: \"665575bc-6ce3-4177-9229-2fe41e45fced\") " pod="openshift-infra/auto-csr-approver-29565270-xhklx" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.462787 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.485989 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565270-xhklx" Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.859997 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796"] Mar 19 10:30:00 crc kubenswrapper[4765]: W0319 10:30:00.867504 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6b8d3c9f_2553_44cf_971d_27dec0e5f66e.slice/crio-ef5af805ff7f1c74e2cd0a007bdc556570b3cc46fbacf26058034b44628795c0 WatchSource:0}: Error finding container ef5af805ff7f1c74e2cd0a007bdc556570b3cc46fbacf26058034b44628795c0: Status 404 returned error can't find the container with id ef5af805ff7f1c74e2cd0a007bdc556570b3cc46fbacf26058034b44628795c0 Mar 19 10:30:00 crc kubenswrapper[4765]: I0319 10:30:00.931421 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565270-xhklx"] Mar 19 10:30:01 crc kubenswrapper[4765]: I0319 10:30:01.656202 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:30:01 crc kubenswrapper[4765]: I0319 10:30:01.656637 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:30:01 crc kubenswrapper[4765]: I0319 10:30:01.730464 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565270-xhklx" 
event={"ID":"665575bc-6ce3-4177-9229-2fe41e45fced","Type":"ContainerStarted","Data":"e38c223e3b12c0dd1dad93084c2b77f5deb8e4bd7b7fc95cc60b88ddfa28e1c9"} Mar 19 10:30:01 crc kubenswrapper[4765]: I0319 10:30:01.732592 4765 generic.go:334] "Generic (PLEG): container finished" podID="6b8d3c9f-2553-44cf-971d-27dec0e5f66e" containerID="49e18886eb0d4d2303675edd42e0a64d9cd5e3bee8e0e35fef0dda43405fcb89" exitCode=0 Mar 19 10:30:01 crc kubenswrapper[4765]: I0319 10:30:01.732651 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" event={"ID":"6b8d3c9f-2553-44cf-971d-27dec0e5f66e","Type":"ContainerDied","Data":"49e18886eb0d4d2303675edd42e0a64d9cd5e3bee8e0e35fef0dda43405fcb89"} Mar 19 10:30:01 crc kubenswrapper[4765]: I0319 10:30:01.732691 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" event={"ID":"6b8d3c9f-2553-44cf-971d-27dec0e5f66e","Type":"ContainerStarted","Data":"ef5af805ff7f1c74e2cd0a007bdc556570b3cc46fbacf26058034b44628795c0"} Mar 19 10:30:03 crc kubenswrapper[4765]: I0319 10:30:03.065602 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" Mar 19 10:30:03 crc kubenswrapper[4765]: I0319 10:30:03.212597 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-config-volume\") pod \"6b8d3c9f-2553-44cf-971d-27dec0e5f66e\" (UID: \"6b8d3c9f-2553-44cf-971d-27dec0e5f66e\") " Mar 19 10:30:03 crc kubenswrapper[4765]: I0319 10:30:03.212737 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v4vsc\" (UniqueName: \"kubernetes.io/projected/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-kube-api-access-v4vsc\") pod \"6b8d3c9f-2553-44cf-971d-27dec0e5f66e\" (UID: \"6b8d3c9f-2553-44cf-971d-27dec0e5f66e\") " Mar 19 10:30:03 crc kubenswrapper[4765]: I0319 10:30:03.212778 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-secret-volume\") pod \"6b8d3c9f-2553-44cf-971d-27dec0e5f66e\" (UID: \"6b8d3c9f-2553-44cf-971d-27dec0e5f66e\") " Mar 19 10:30:03 crc kubenswrapper[4765]: I0319 10:30:03.213683 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-config-volume" (OuterVolumeSpecName: "config-volume") pod "6b8d3c9f-2553-44cf-971d-27dec0e5f66e" (UID: "6b8d3c9f-2553-44cf-971d-27dec0e5f66e"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:30:03 crc kubenswrapper[4765]: I0319 10:30:03.219225 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-kube-api-access-v4vsc" (OuterVolumeSpecName: "kube-api-access-v4vsc") pod "6b8d3c9f-2553-44cf-971d-27dec0e5f66e" (UID: "6b8d3c9f-2553-44cf-971d-27dec0e5f66e"). 
InnerVolumeSpecName "kube-api-access-v4vsc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:30:03 crc kubenswrapper[4765]: I0319 10:30:03.220310 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "6b8d3c9f-2553-44cf-971d-27dec0e5f66e" (UID: "6b8d3c9f-2553-44cf-971d-27dec0e5f66e"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:30:03 crc kubenswrapper[4765]: I0319 10:30:03.314682 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v4vsc\" (UniqueName: \"kubernetes.io/projected/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-kube-api-access-v4vsc\") on node \"crc\" DevicePath \"\"" Mar 19 10:30:03 crc kubenswrapper[4765]: I0319 10:30:03.314746 4765 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 10:30:03 crc kubenswrapper[4765]: I0319 10:30:03.314760 4765 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/6b8d3c9f-2553-44cf-971d-27dec0e5f66e-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 10:30:03 crc kubenswrapper[4765]: I0319 10:30:03.745507 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" event={"ID":"6b8d3c9f-2553-44cf-971d-27dec0e5f66e","Type":"ContainerDied","Data":"ef5af805ff7f1c74e2cd0a007bdc556570b3cc46fbacf26058034b44628795c0"} Mar 19 10:30:03 crc kubenswrapper[4765]: I0319 10:30:03.745579 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796" Mar 19 10:30:03 crc kubenswrapper[4765]: I0319 10:30:03.745595 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef5af805ff7f1c74e2cd0a007bdc556570b3cc46fbacf26058034b44628795c0" Mar 19 10:30:04 crc kubenswrapper[4765]: I0319 10:30:04.751701 4765 generic.go:334] "Generic (PLEG): container finished" podID="665575bc-6ce3-4177-9229-2fe41e45fced" containerID="9c2a9d1ed92b4f97e449d5a1f6e5e1be3bc702936ec8effeea98a03643002fc3" exitCode=0 Mar 19 10:30:04 crc kubenswrapper[4765]: I0319 10:30:04.751932 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565270-xhklx" event={"ID":"665575bc-6ce3-4177-9229-2fe41e45fced","Type":"ContainerDied","Data":"9c2a9d1ed92b4f97e449d5a1f6e5e1be3bc702936ec8effeea98a03643002fc3"} Mar 19 10:30:06 crc kubenswrapper[4765]: I0319 10:30:06.093069 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565270-xhklx" Mar 19 10:30:06 crc kubenswrapper[4765]: I0319 10:30:06.255195 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rd6df\" (UniqueName: \"kubernetes.io/projected/665575bc-6ce3-4177-9229-2fe41e45fced-kube-api-access-rd6df\") pod \"665575bc-6ce3-4177-9229-2fe41e45fced\" (UID: \"665575bc-6ce3-4177-9229-2fe41e45fced\") " Mar 19 10:30:06 crc kubenswrapper[4765]: I0319 10:30:06.262156 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/665575bc-6ce3-4177-9229-2fe41e45fced-kube-api-access-rd6df" (OuterVolumeSpecName: "kube-api-access-rd6df") pod "665575bc-6ce3-4177-9229-2fe41e45fced" (UID: "665575bc-6ce3-4177-9229-2fe41e45fced"). InnerVolumeSpecName "kube-api-access-rd6df". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:30:06 crc kubenswrapper[4765]: I0319 10:30:06.356803 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rd6df\" (UniqueName: \"kubernetes.io/projected/665575bc-6ce3-4177-9229-2fe41e45fced-kube-api-access-rd6df\") on node \"crc\" DevicePath \"\"" Mar 19 10:30:06 crc kubenswrapper[4765]: I0319 10:30:06.765987 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565270-xhklx" event={"ID":"665575bc-6ce3-4177-9229-2fe41e45fced","Type":"ContainerDied","Data":"e38c223e3b12c0dd1dad93084c2b77f5deb8e4bd7b7fc95cc60b88ddfa28e1c9"} Mar 19 10:30:06 crc kubenswrapper[4765]: I0319 10:30:06.766032 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e38c223e3b12c0dd1dad93084c2b77f5deb8e4bd7b7fc95cc60b88ddfa28e1c9" Mar 19 10:30:06 crc kubenswrapper[4765]: I0319 10:30:06.766084 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565270-xhklx" Mar 19 10:30:07 crc kubenswrapper[4765]: I0319 10:30:07.165399 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565264-nvg5v"] Mar 19 10:30:07 crc kubenswrapper[4765]: I0319 10:30:07.169274 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565264-nvg5v"] Mar 19 10:30:08 crc kubenswrapper[4765]: I0319 10:30:08.363236 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5495eef-efca-4df2-81bb-bd93bb2f8a38" path="/var/lib/kubelet/pods/c5495eef-efca-4df2-81bb-bd93bb2f8a38/volumes" Mar 19 10:30:09 crc kubenswrapper[4765]: I0319 10:30:09.764593 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 19 10:30:31 crc kubenswrapper[4765]: I0319 10:30:31.656893 4765 patch_prober.go:28] interesting 
pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:30:31 crc kubenswrapper[4765]: I0319 10:30:31.657691 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:30:31 crc kubenswrapper[4765]: I0319 10:30:31.657768 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:30:31 crc kubenswrapper[4765]: I0319 10:30:31.658575 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"77d960355abace30efa3218c2c2218608f2c437a4f4180a38603f11b6f6f7a6e"} pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:30:31 crc kubenswrapper[4765]: I0319 10:30:31.658646 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://77d960355abace30efa3218c2c2218608f2c437a4f4180a38603f11b6f6f7a6e" gracePeriod=600 Mar 19 10:30:31 crc kubenswrapper[4765]: I0319 10:30:31.918084 4765 generic.go:334] "Generic (PLEG): container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerID="77d960355abace30efa3218c2c2218608f2c437a4f4180a38603f11b6f6f7a6e" exitCode=0 Mar 19 10:30:31 crc kubenswrapper[4765]: I0319 10:30:31.918412 
4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"77d960355abace30efa3218c2c2218608f2c437a4f4180a38603f11b6f6f7a6e"} Mar 19 10:30:31 crc kubenswrapper[4765]: I0319 10:30:31.918920 4765 scope.go:117] "RemoveContainer" containerID="bac2fb12c527f8417dc08378065d3f56b5cc55052743230c4bb2f532ca6be00d" Mar 19 10:30:32 crc kubenswrapper[4765]: I0319 10:30:32.931564 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"52c282ccaa9b12441cda9329a58912d852bd314df6cce7dba63b49f9309b4b08"} Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.151090 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565272-c8lkp"] Mar 19 10:32:00 crc kubenswrapper[4765]: E0319 10:32:00.154222 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b8d3c9f-2553-44cf-971d-27dec0e5f66e" containerName="collect-profiles" Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.154372 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b8d3c9f-2553-44cf-971d-27dec0e5f66e" containerName="collect-profiles" Mar 19 10:32:00 crc kubenswrapper[4765]: E0319 10:32:00.154455 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="665575bc-6ce3-4177-9229-2fe41e45fced" containerName="oc" Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.154516 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="665575bc-6ce3-4177-9229-2fe41e45fced" containerName="oc" Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.154673 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="665575bc-6ce3-4177-9229-2fe41e45fced" containerName="oc" Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.154759 4765 
memory_manager.go:354] "RemoveStaleState removing state" podUID="6b8d3c9f-2553-44cf-971d-27dec0e5f66e" containerName="collect-profiles" Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.157339 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565272-c8lkp"] Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.157569 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565272-c8lkp" Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.160734 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.161033 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.161269 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.303831 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l45rc\" (UniqueName: \"kubernetes.io/projected/24961073-0b01-4ce3-8929-e916408b3431-kube-api-access-l45rc\") pod \"auto-csr-approver-29565272-c8lkp\" (UID: \"24961073-0b01-4ce3-8929-e916408b3431\") " pod="openshift-infra/auto-csr-approver-29565272-c8lkp" Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.405166 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l45rc\" (UniqueName: \"kubernetes.io/projected/24961073-0b01-4ce3-8929-e916408b3431-kube-api-access-l45rc\") pod \"auto-csr-approver-29565272-c8lkp\" (UID: \"24961073-0b01-4ce3-8929-e916408b3431\") " pod="openshift-infra/auto-csr-approver-29565272-c8lkp" Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.431717 4765 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-l45rc\" (UniqueName: \"kubernetes.io/projected/24961073-0b01-4ce3-8929-e916408b3431-kube-api-access-l45rc\") pod \"auto-csr-approver-29565272-c8lkp\" (UID: \"24961073-0b01-4ce3-8929-e916408b3431\") " pod="openshift-infra/auto-csr-approver-29565272-c8lkp" Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.481066 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565272-c8lkp" Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.714739 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565272-c8lkp"] Mar 19 10:32:00 crc kubenswrapper[4765]: I0319 10:32:00.735639 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 10:32:01 crc kubenswrapper[4765]: I0319 10:32:01.478617 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565272-c8lkp" event={"ID":"24961073-0b01-4ce3-8929-e916408b3431","Type":"ContainerStarted","Data":"1976d928cd2a25751146bd36ea087ed557b8e7d7b4cb9cdf58440cfde0645671"} Mar 19 10:32:02 crc kubenswrapper[4765]: I0319 10:32:02.487059 4765 generic.go:334] "Generic (PLEG): container finished" podID="24961073-0b01-4ce3-8929-e916408b3431" containerID="96a65f207ad8e02366d3481f11cd82724013ad6566a214aacb391606ba3a7559" exitCode=0 Mar 19 10:32:02 crc kubenswrapper[4765]: I0319 10:32:02.487191 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565272-c8lkp" event={"ID":"24961073-0b01-4ce3-8929-e916408b3431","Type":"ContainerDied","Data":"96a65f207ad8e02366d3481f11cd82724013ad6566a214aacb391606ba3a7559"} Mar 19 10:32:03 crc kubenswrapper[4765]: I0319 10:32:03.720898 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565272-c8lkp" Mar 19 10:32:03 crc kubenswrapper[4765]: I0319 10:32:03.855167 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l45rc\" (UniqueName: \"kubernetes.io/projected/24961073-0b01-4ce3-8929-e916408b3431-kube-api-access-l45rc\") pod \"24961073-0b01-4ce3-8929-e916408b3431\" (UID: \"24961073-0b01-4ce3-8929-e916408b3431\") " Mar 19 10:32:03 crc kubenswrapper[4765]: I0319 10:32:03.862835 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24961073-0b01-4ce3-8929-e916408b3431-kube-api-access-l45rc" (OuterVolumeSpecName: "kube-api-access-l45rc") pod "24961073-0b01-4ce3-8929-e916408b3431" (UID: "24961073-0b01-4ce3-8929-e916408b3431"). InnerVolumeSpecName "kube-api-access-l45rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:32:03 crc kubenswrapper[4765]: I0319 10:32:03.957296 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l45rc\" (UniqueName: \"kubernetes.io/projected/24961073-0b01-4ce3-8929-e916408b3431-kube-api-access-l45rc\") on node \"crc\" DevicePath \"\"" Mar 19 10:32:04 crc kubenswrapper[4765]: I0319 10:32:04.503217 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565272-c8lkp" event={"ID":"24961073-0b01-4ce3-8929-e916408b3431","Type":"ContainerDied","Data":"1976d928cd2a25751146bd36ea087ed557b8e7d7b4cb9cdf58440cfde0645671"} Mar 19 10:32:04 crc kubenswrapper[4765]: I0319 10:32:04.503277 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1976d928cd2a25751146bd36ea087ed557b8e7d7b4cb9cdf58440cfde0645671" Mar 19 10:32:04 crc kubenswrapper[4765]: I0319 10:32:04.503331 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565272-c8lkp" Mar 19 10:32:04 crc kubenswrapper[4765]: I0319 10:32:04.789638 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565266-lgbhn"] Mar 19 10:32:04 crc kubenswrapper[4765]: I0319 10:32:04.792790 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565266-lgbhn"] Mar 19 10:32:06 crc kubenswrapper[4765]: I0319 10:32:06.364120 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf72b802-ec4b-4a38-b575-d037677fe0dc" path="/var/lib/kubelet/pods/cf72b802-ec4b-4a38-b575-d037677fe0dc/volumes" Mar 19 10:32:31 crc kubenswrapper[4765]: I0319 10:32:31.655902 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:32:31 crc kubenswrapper[4765]: I0319 10:32:31.656626 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:33:01 crc kubenswrapper[4765]: I0319 10:33:01.656002 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:33:01 crc kubenswrapper[4765]: I0319 10:33:01.656864 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" 
podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:33:31 crc kubenswrapper[4765]: I0319 10:33:31.656795 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:33:31 crc kubenswrapper[4765]: I0319 10:33:31.657693 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:33:31 crc kubenswrapper[4765]: I0319 10:33:31.657763 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:33:31 crc kubenswrapper[4765]: I0319 10:33:31.658418 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"52c282ccaa9b12441cda9329a58912d852bd314df6cce7dba63b49f9309b4b08"} pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:33:31 crc kubenswrapper[4765]: I0319 10:33:31.658476 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://52c282ccaa9b12441cda9329a58912d852bd314df6cce7dba63b49f9309b4b08" gracePeriod=600 Mar 19 
10:33:32 crc kubenswrapper[4765]: I0319 10:33:32.944360 4765 generic.go:334] "Generic (PLEG): container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerID="52c282ccaa9b12441cda9329a58912d852bd314df6cce7dba63b49f9309b4b08" exitCode=0 Mar 19 10:33:32 crc kubenswrapper[4765]: I0319 10:33:32.944948 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"52c282ccaa9b12441cda9329a58912d852bd314df6cce7dba63b49f9309b4b08"} Mar 19 10:33:32 crc kubenswrapper[4765]: I0319 10:33:32.945038 4765 scope.go:117] "RemoveContainer" containerID="77d960355abace30efa3218c2c2218608f2c437a4f4180a38603f11b6f6f7a6e" Mar 19 10:33:33 crc kubenswrapper[4765]: I0319 10:33:33.955313 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"c126315de99fbe26aafdf378053a0eb9d09d2fb8e735089e0f39caceb743cb3e"} Mar 19 10:33:42 crc kubenswrapper[4765]: I0319 10:33:42.693950 4765 scope.go:117] "RemoveContainer" containerID="0bf0e12ea9fc117d627506b4888060ec90e30bfe7dff8e82eb28c4883e4b4cfd" Mar 19 10:33:42 crc kubenswrapper[4765]: I0319 10:33:42.729673 4765 scope.go:117] "RemoveContainer" containerID="5ae3e8e4fdb71e67ae4c0d05c82a17bea85182b47a02d646d519d113fbfa7698" Mar 19 10:34:00 crc kubenswrapper[4765]: I0319 10:34:00.148241 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565274-dcpdf"] Mar 19 10:34:00 crc kubenswrapper[4765]: E0319 10:34:00.149324 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24961073-0b01-4ce3-8929-e916408b3431" containerName="oc" Mar 19 10:34:00 crc kubenswrapper[4765]: I0319 10:34:00.149351 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="24961073-0b01-4ce3-8929-e916408b3431" 
containerName="oc" Mar 19 10:34:00 crc kubenswrapper[4765]: I0319 10:34:00.149532 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="24961073-0b01-4ce3-8929-e916408b3431" containerName="oc" Mar 19 10:34:00 crc kubenswrapper[4765]: I0319 10:34:00.150213 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565274-dcpdf" Mar 19 10:34:00 crc kubenswrapper[4765]: I0319 10:34:00.153920 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:34:00 crc kubenswrapper[4765]: I0319 10:34:00.153935 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:34:00 crc kubenswrapper[4765]: I0319 10:34:00.154420 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:34:00 crc kubenswrapper[4765]: I0319 10:34:00.155244 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565274-dcpdf"] Mar 19 10:34:00 crc kubenswrapper[4765]: I0319 10:34:00.198217 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6ng7\" (UniqueName: \"kubernetes.io/projected/ddd89057-e049-46d9-823f-f38e8297b7fd-kube-api-access-p6ng7\") pod \"auto-csr-approver-29565274-dcpdf\" (UID: \"ddd89057-e049-46d9-823f-f38e8297b7fd\") " pod="openshift-infra/auto-csr-approver-29565274-dcpdf" Mar 19 10:34:00 crc kubenswrapper[4765]: I0319 10:34:00.299496 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6ng7\" (UniqueName: \"kubernetes.io/projected/ddd89057-e049-46d9-823f-f38e8297b7fd-kube-api-access-p6ng7\") pod \"auto-csr-approver-29565274-dcpdf\" (UID: \"ddd89057-e049-46d9-823f-f38e8297b7fd\") " pod="openshift-infra/auto-csr-approver-29565274-dcpdf" Mar 19 10:34:00 crc 
kubenswrapper[4765]: I0319 10:34:00.323050 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6ng7\" (UniqueName: \"kubernetes.io/projected/ddd89057-e049-46d9-823f-f38e8297b7fd-kube-api-access-p6ng7\") pod \"auto-csr-approver-29565274-dcpdf\" (UID: \"ddd89057-e049-46d9-823f-f38e8297b7fd\") " pod="openshift-infra/auto-csr-approver-29565274-dcpdf" Mar 19 10:34:00 crc kubenswrapper[4765]: I0319 10:34:00.468393 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565274-dcpdf" Mar 19 10:34:00 crc kubenswrapper[4765]: I0319 10:34:00.693429 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565274-dcpdf"] Mar 19 10:34:01 crc kubenswrapper[4765]: I0319 10:34:01.164801 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565274-dcpdf" event={"ID":"ddd89057-e049-46d9-823f-f38e8297b7fd","Type":"ContainerStarted","Data":"13d880210e0509ed6336229dfab6a4f930ae1cbf5231248ba723584616e19e16"} Mar 19 10:34:02 crc kubenswrapper[4765]: I0319 10:34:02.172186 4765 generic.go:334] "Generic (PLEG): container finished" podID="ddd89057-e049-46d9-823f-f38e8297b7fd" containerID="b60c0e4a6754265a9c7b14fd3467c8216f5d00dac9f3e0e7fa7e292fe00bf27f" exitCode=0 Mar 19 10:34:02 crc kubenswrapper[4765]: I0319 10:34:02.172278 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565274-dcpdf" event={"ID":"ddd89057-e049-46d9-823f-f38e8297b7fd","Type":"ContainerDied","Data":"b60c0e4a6754265a9c7b14fd3467c8216f5d00dac9f3e0e7fa7e292fe00bf27f"} Mar 19 10:34:03 crc kubenswrapper[4765]: I0319 10:34:03.426643 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565274-dcpdf" Mar 19 10:34:03 crc kubenswrapper[4765]: I0319 10:34:03.545417 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6ng7\" (UniqueName: \"kubernetes.io/projected/ddd89057-e049-46d9-823f-f38e8297b7fd-kube-api-access-p6ng7\") pod \"ddd89057-e049-46d9-823f-f38e8297b7fd\" (UID: \"ddd89057-e049-46d9-823f-f38e8297b7fd\") " Mar 19 10:34:03 crc kubenswrapper[4765]: I0319 10:34:03.553254 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ddd89057-e049-46d9-823f-f38e8297b7fd-kube-api-access-p6ng7" (OuterVolumeSpecName: "kube-api-access-p6ng7") pod "ddd89057-e049-46d9-823f-f38e8297b7fd" (UID: "ddd89057-e049-46d9-823f-f38e8297b7fd"). InnerVolumeSpecName "kube-api-access-p6ng7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:34:03 crc kubenswrapper[4765]: I0319 10:34:03.649354 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6ng7\" (UniqueName: \"kubernetes.io/projected/ddd89057-e049-46d9-823f-f38e8297b7fd-kube-api-access-p6ng7\") on node \"crc\" DevicePath \"\"" Mar 19 10:34:04 crc kubenswrapper[4765]: I0319 10:34:04.189402 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565274-dcpdf" event={"ID":"ddd89057-e049-46d9-823f-f38e8297b7fd","Type":"ContainerDied","Data":"13d880210e0509ed6336229dfab6a4f930ae1cbf5231248ba723584616e19e16"} Mar 19 10:34:04 crc kubenswrapper[4765]: I0319 10:34:04.189462 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565274-dcpdf" Mar 19 10:34:04 crc kubenswrapper[4765]: I0319 10:34:04.189469 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13d880210e0509ed6336229dfab6a4f930ae1cbf5231248ba723584616e19e16" Mar 19 10:34:04 crc kubenswrapper[4765]: I0319 10:34:04.488918 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565268-t2x8x"] Mar 19 10:34:04 crc kubenswrapper[4765]: I0319 10:34:04.493635 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565268-t2x8x"] Mar 19 10:34:06 crc kubenswrapper[4765]: I0319 10:34:06.363769 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6237c3c3-e25e-4b5d-8b7a-66198a313195" path="/var/lib/kubelet/pods/6237c3c3-e25e-4b5d-8b7a-66198a313195/volumes" Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.919901 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-tp66g"] Mar 19 10:35:07 crc kubenswrapper[4765]: E0319 10:35:07.920949 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ddd89057-e049-46d9-823f-f38e8297b7fd" containerName="oc" Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.920982 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ddd89057-e049-46d9-823f-f38e8297b7fd" containerName="oc" Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.921094 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ddd89057-e049-46d9-823f-f38e8297b7fd" containerName="oc" Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.921670 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-tp66g" Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.924454 4765 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-vnfr6" Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.924534 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.924572 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.927950 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-nb6dw"] Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.928874 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-nb6dw" Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.933017 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-tp66g"] Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.934258 4765 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-m2wtn" Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.942436 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-nb6dw"] Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.972590 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8ld4\" (UniqueName: \"kubernetes.io/projected/4a6670fe-5988-4bfd-8468-b2a5f6cd9997-kube-api-access-m8ld4\") pod \"cert-manager-858654f9db-nb6dw\" (UID: \"4a6670fe-5988-4bfd-8468-b2a5f6cd9997\") " pod="cert-manager/cert-manager-858654f9db-nb6dw" Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.972667 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sfm8l\" (UniqueName: \"kubernetes.io/projected/67148642-28c7-4217-b91a-3badb42c4c38-kube-api-access-sfm8l\") pod \"cert-manager-cainjector-cf98fcc89-tp66g\" (UID: \"67148642-28c7-4217-b91a-3badb42c4c38\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-tp66g" Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.998190 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-w862m"] Mar 19 10:35:07 crc kubenswrapper[4765]: I0319 10:35:07.999212 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-w862m" Mar 19 10:35:08 crc kubenswrapper[4765]: I0319 10:35:08.001889 4765 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-rf2h8" Mar 19 10:35:08 crc kubenswrapper[4765]: I0319 10:35:08.003322 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-w862m"] Mar 19 10:35:08 crc kubenswrapper[4765]: I0319 10:35:08.074467 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8ld4\" (UniqueName: \"kubernetes.io/projected/4a6670fe-5988-4bfd-8468-b2a5f6cd9997-kube-api-access-m8ld4\") pod \"cert-manager-858654f9db-nb6dw\" (UID: \"4a6670fe-5988-4bfd-8468-b2a5f6cd9997\") " pod="cert-manager/cert-manager-858654f9db-nb6dw" Mar 19 10:35:08 crc kubenswrapper[4765]: I0319 10:35:08.074539 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ks9r\" (UniqueName: \"kubernetes.io/projected/1b499d05-d228-4268-8b1f-8b3c8687870f-kube-api-access-4ks9r\") pod \"cert-manager-webhook-687f57d79b-w862m\" (UID: \"1b499d05-d228-4268-8b1f-8b3c8687870f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-w862m" Mar 19 10:35:08 crc kubenswrapper[4765]: 
I0319 10:35:08.074627 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sfm8l\" (UniqueName: \"kubernetes.io/projected/67148642-28c7-4217-b91a-3badb42c4c38-kube-api-access-sfm8l\") pod \"cert-manager-cainjector-cf98fcc89-tp66g\" (UID: \"67148642-28c7-4217-b91a-3badb42c4c38\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-tp66g" Mar 19 10:35:08 crc kubenswrapper[4765]: I0319 10:35:08.097420 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sfm8l\" (UniqueName: \"kubernetes.io/projected/67148642-28c7-4217-b91a-3badb42c4c38-kube-api-access-sfm8l\") pod \"cert-manager-cainjector-cf98fcc89-tp66g\" (UID: \"67148642-28c7-4217-b91a-3badb42c4c38\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-tp66g" Mar 19 10:35:08 crc kubenswrapper[4765]: I0319 10:35:08.103192 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8ld4\" (UniqueName: \"kubernetes.io/projected/4a6670fe-5988-4bfd-8468-b2a5f6cd9997-kube-api-access-m8ld4\") pod \"cert-manager-858654f9db-nb6dw\" (UID: \"4a6670fe-5988-4bfd-8468-b2a5f6cd9997\") " pod="cert-manager/cert-manager-858654f9db-nb6dw" Mar 19 10:35:08 crc kubenswrapper[4765]: I0319 10:35:08.175558 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4ks9r\" (UniqueName: \"kubernetes.io/projected/1b499d05-d228-4268-8b1f-8b3c8687870f-kube-api-access-4ks9r\") pod \"cert-manager-webhook-687f57d79b-w862m\" (UID: \"1b499d05-d228-4268-8b1f-8b3c8687870f\") " pod="cert-manager/cert-manager-webhook-687f57d79b-w862m" Mar 19 10:35:08 crc kubenswrapper[4765]: I0319 10:35:08.194185 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ks9r\" (UniqueName: \"kubernetes.io/projected/1b499d05-d228-4268-8b1f-8b3c8687870f-kube-api-access-4ks9r\") pod \"cert-manager-webhook-687f57d79b-w862m\" (UID: \"1b499d05-d228-4268-8b1f-8b3c8687870f\") " 
pod="cert-manager/cert-manager-webhook-687f57d79b-w862m" Mar 19 10:35:08 crc kubenswrapper[4765]: I0319 10:35:08.248889 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-tp66g" Mar 19 10:35:08 crc kubenswrapper[4765]: I0319 10:35:08.261818 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-nb6dw" Mar 19 10:35:08 crc kubenswrapper[4765]: I0319 10:35:08.313007 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-w862m" Mar 19 10:35:08 crc kubenswrapper[4765]: I0319 10:35:08.482818 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-nb6dw"] Mar 19 10:35:09 crc kubenswrapper[4765]: I0319 10:35:08.527377 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-tp66g"] Mar 19 10:35:09 crc kubenswrapper[4765]: W0319 10:35:08.533451 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod67148642_28c7_4217_b91a_3badb42c4c38.slice/crio-03c02062b46a6ecfb04ab64be84ec46798c26e17a7067d8eb5ab18fb6f4ffe29 WatchSource:0}: Error finding container 03c02062b46a6ecfb04ab64be84ec46798c26e17a7067d8eb5ab18fb6f4ffe29: Status 404 returned error can't find the container with id 03c02062b46a6ecfb04ab64be84ec46798c26e17a7067d8eb5ab18fb6f4ffe29 Mar 19 10:35:09 crc kubenswrapper[4765]: I0319 10:35:08.587254 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-w862m"] Mar 19 10:35:09 crc kubenswrapper[4765]: W0319 10:35:08.593671 4765 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1b499d05_d228_4268_8b1f_8b3c8687870f.slice/crio-d440620030bd87c13dc09865a3ee5c70a2e6ca3e0f89530a3d0df85aff65dfe3 WatchSource:0}: Error finding container d440620030bd87c13dc09865a3ee5c70a2e6ca3e0f89530a3d0df85aff65dfe3: Status 404 returned error can't find the container with id d440620030bd87c13dc09865a3ee5c70a2e6ca3e0f89530a3d0df85aff65dfe3 Mar 19 10:35:09 crc kubenswrapper[4765]: I0319 10:35:08.628482 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-nb6dw" event={"ID":"4a6670fe-5988-4bfd-8468-b2a5f6cd9997","Type":"ContainerStarted","Data":"a3959510e16dc7312176208245c313d10c9be97ddf13410261e37ed5a2810c19"} Mar 19 10:35:09 crc kubenswrapper[4765]: I0319 10:35:08.629885 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-tp66g" event={"ID":"67148642-28c7-4217-b91a-3badb42c4c38","Type":"ContainerStarted","Data":"03c02062b46a6ecfb04ab64be84ec46798c26e17a7067d8eb5ab18fb6f4ffe29"} Mar 19 10:35:09 crc kubenswrapper[4765]: I0319 10:35:08.630880 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-w862m" event={"ID":"1b499d05-d228-4268-8b1f-8b3c8687870f","Type":"ContainerStarted","Data":"d440620030bd87c13dc09865a3ee5c70a2e6ca3e0f89530a3d0df85aff65dfe3"} Mar 19 10:35:13 crc kubenswrapper[4765]: I0319 10:35:13.669500 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-w862m" event={"ID":"1b499d05-d228-4268-8b1f-8b3c8687870f","Type":"ContainerStarted","Data":"8b9a00ded2c2e86f850be63dd5c6e0756821c20b5703915c38a10c6958a5e754"} Mar 19 10:35:13 crc kubenswrapper[4765]: I0319 10:35:13.670281 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-w862m" Mar 19 10:35:13 crc kubenswrapper[4765]: I0319 10:35:13.672174 4765 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-nb6dw" event={"ID":"4a6670fe-5988-4bfd-8468-b2a5f6cd9997","Type":"ContainerStarted","Data":"1bc50af48658459447e04ad44de146203c9b25e8cbad2e7cea3e6def71deede1"} Mar 19 10:35:13 crc kubenswrapper[4765]: I0319 10:35:13.674850 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-tp66g" event={"ID":"67148642-28c7-4217-b91a-3badb42c4c38","Type":"ContainerStarted","Data":"1cc785bd9d75c94c5ce3024108b6bd3d5c6e68ef0953d1035ed5e6a3b3cae96e"} Mar 19 10:35:13 crc kubenswrapper[4765]: I0319 10:35:13.688070 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-w862m" podStartSLOduration=2.604623545 podStartE2EDuration="6.688051007s" podCreationTimestamp="2026-03-19 10:35:07 +0000 UTC" firstStartedPulling="2026-03-19 10:35:08.597904142 +0000 UTC m=+806.946849684" lastFinishedPulling="2026-03-19 10:35:12.681331604 +0000 UTC m=+811.030277146" observedRunningTime="2026-03-19 10:35:13.687716278 +0000 UTC m=+812.036661830" watchObservedRunningTime="2026-03-19 10:35:13.688051007 +0000 UTC m=+812.036996539" Mar 19 10:35:13 crc kubenswrapper[4765]: I0319 10:35:13.705825 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-tp66g" podStartSLOduration=2.562195275 podStartE2EDuration="6.705799667s" podCreationTimestamp="2026-03-19 10:35:07 +0000 UTC" firstStartedPulling="2026-03-19 10:35:08.538947396 +0000 UTC m=+806.887892938" lastFinishedPulling="2026-03-19 10:35:12.682551788 +0000 UTC m=+811.031497330" observedRunningTime="2026-03-19 10:35:13.703574685 +0000 UTC m=+812.052520247" watchObservedRunningTime="2026-03-19 10:35:13.705799667 +0000 UTC m=+812.054745239" Mar 19 10:35:13 crc kubenswrapper[4765]: I0319 10:35:13.731662 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="cert-manager/cert-manager-858654f9db-nb6dw" podStartSLOduration=2.545831284 podStartE2EDuration="6.731625819s" podCreationTimestamp="2026-03-19 10:35:07 +0000 UTC" firstStartedPulling="2026-03-19 10:35:08.493564975 +0000 UTC m=+806.842510517" lastFinishedPulling="2026-03-19 10:35:12.67935951 +0000 UTC m=+811.028305052" observedRunningTime="2026-03-19 10:35:13.727708331 +0000 UTC m=+812.076653883" watchObservedRunningTime="2026-03-19 10:35:13.731625819 +0000 UTC m=+812.080571361" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.525305 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kvv2h"] Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.526402 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="nbdb" containerID="cri-o://7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91" gracePeriod=30 Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.526596 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="sbdb" containerID="cri-o://f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a" gracePeriod=30 Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.526397 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovn-controller" containerID="cri-o://ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2" gracePeriod=30 Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.526774 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="northd" 
containerID="cri-o://778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2" gracePeriod=30 Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.526862 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0" gracePeriod=30 Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.526938 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="kube-rbac-proxy-node" containerID="cri-o://8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367" gracePeriod=30 Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.526996 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovn-acl-logging" containerID="cri-o://071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88" gracePeriod=30 Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.615419 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" containerID="cri-o://ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a" gracePeriod=30 Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.700155 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mmrh7_d9d027fd-4e70-4daf-9dd2-adefcc2a868f/kube-multus/2.log" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.700761 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-mmrh7_d9d027fd-4e70-4daf-9dd2-adefcc2a868f/kube-multus/1.log" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.700816 4765 generic.go:334] "Generic (PLEG): container finished" podID="d9d027fd-4e70-4daf-9dd2-adefcc2a868f" containerID="e5956a936881882c7602c0fc752793d8c139ba7cbb4a3a5cd015552febec3d5b" exitCode=2 Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.700916 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mmrh7" event={"ID":"d9d027fd-4e70-4daf-9dd2-adefcc2a868f","Type":"ContainerDied","Data":"e5956a936881882c7602c0fc752793d8c139ba7cbb4a3a5cd015552febec3d5b"} Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.701000 4765 scope.go:117] "RemoveContainer" containerID="c7758269c24290d83fecce42e3d1f3e9569ca7c19b2c66970a979d98455cb6f2" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.701804 4765 scope.go:117] "RemoveContainer" containerID="e5956a936881882c7602c0fc752793d8c139ba7cbb4a3a5cd015552febec3d5b" Mar 19 10:35:17 crc kubenswrapper[4765]: E0319 10:35:17.702194 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mmrh7_openshift-multus(d9d027fd-4e70-4daf-9dd2-adefcc2a868f)\"" pod="openshift-multus/multus-mmrh7" podUID="d9d027fd-4e70-4daf-9dd2-adefcc2a868f" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.706034 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/3.log" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.727126 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovn-acl-logging/0.log" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.731151 4765 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovn-controller/0.log" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.731907 4765 generic.go:334] "Generic (PLEG): container finished" podID="71cc276b-f25c-460b-b718-f058cc1d2521" containerID="5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0" exitCode=0 Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.731977 4765 generic.go:334] "Generic (PLEG): container finished" podID="71cc276b-f25c-460b-b718-f058cc1d2521" containerID="8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367" exitCode=0 Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.731993 4765 generic.go:334] "Generic (PLEG): container finished" podID="71cc276b-f25c-460b-b718-f058cc1d2521" containerID="071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88" exitCode=143 Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.732004 4765 generic.go:334] "Generic (PLEG): container finished" podID="71cc276b-f25c-460b-b718-f058cc1d2521" containerID="ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2" exitCode=143 Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.732055 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerDied","Data":"5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0"} Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.732097 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerDied","Data":"8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367"} Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.732132 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" 
event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerDied","Data":"071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88"} Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.732143 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerDied","Data":"ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2"} Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.863249 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/3.log" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.865864 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovn-acl-logging/0.log" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.866495 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovn-controller/0.log" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.867119 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914274 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-systemd\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914373 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-run-ovn-kubernetes\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914398 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-var-lib-openvswitch\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914448 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-log-socket\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914489 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914505 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-q2sdr\" (UniqueName: \"kubernetes.io/projected/71cc276b-f25c-460b-b718-f058cc1d2521-kube-api-access-q2sdr\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914537 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-ovn\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914547 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-log-socket" (OuterVolumeSpecName: "log-socket") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "log-socket". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914559 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-systemd-units\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914581 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-etc-openvswitch\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914683 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-kubelet\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914727 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/71cc276b-f25c-460b-b718-f058cc1d2521-ovn-node-metrics-cert\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914791 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-ovnkube-config\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914836 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" 
(UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-slash\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914905 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-cni-bin\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914940 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-node-log\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.914987 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-ovnkube-script-lib\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915024 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-run-netns\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915055 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-var-lib-cni-networks-ovn-kubernetes\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " 
Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915086 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-env-overrides\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915131 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-openvswitch\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915176 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-cni-netd\") pod \"71cc276b-f25c-460b-b718-f058cc1d2521\" (UID: \"71cc276b-f25c-460b-b718-f058cc1d2521\") " Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915219 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915296 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-slash" (OuterVolumeSpecName: "host-slash") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "host-slash". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915322 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "run-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915345 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915383 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915410 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "host-cni-bin". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915432 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-node-log" (OuterVolumeSpecName: "node-log") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915587 4765 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-slash\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915613 4765 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915624 4765 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-node-log\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915634 4765 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915648 4765 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915676 4765 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915686 4765 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-log-socket\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915695 4765 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915704 4765 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915731 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915754 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "host-kubelet". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915873 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915919 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "host-run-netns". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.915955 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.916529 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.917644 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.918166 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.922917 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71cc276b-f25c-460b-b718-f058cc1d2521-kube-api-access-q2sdr" (OuterVolumeSpecName: "kube-api-access-q2sdr") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "kube-api-access-q2sdr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.928659 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71cc276b-f25c-460b-b718-f058cc1d2521-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "ovn-node-metrics-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.931396 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-xdfxh"] Mar 19 10:35:17 crc kubenswrapper[4765]: E0319 10:35:17.931842 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.931869 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 10:35:17 crc kubenswrapper[4765]: E0319 10:35:17.931885 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.931894 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: E0319 10:35:17.931905 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.931913 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: E0319 10:35:17.931923 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="northd" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.931932 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="northd" Mar 19 10:35:17 crc kubenswrapper[4765]: E0319 10:35:17.931945 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" 
containerName="ovn-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.931953 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovn-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: E0319 10:35:17.931990 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="kube-rbac-proxy-node" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932072 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="kube-rbac-proxy-node" Mar 19 10:35:17 crc kubenswrapper[4765]: E0319 10:35:17.932089 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovn-acl-logging" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932098 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovn-acl-logging" Mar 19 10:35:17 crc kubenswrapper[4765]: E0319 10:35:17.932110 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932122 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: E0319 10:35:17.932132 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932140 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: E0319 10:35:17.932153 4765 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932162 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: E0319 10:35:17.932173 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="nbdb" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932181 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="nbdb" Mar 19 10:35:17 crc kubenswrapper[4765]: E0319 10:35:17.932195 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="kubecfg-setup" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932203 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="kubecfg-setup" Mar 19 10:35:17 crc kubenswrapper[4765]: E0319 10:35:17.932215 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="sbdb" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932223 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="sbdb" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932345 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="kube-rbac-proxy-node" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932357 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932368 4765 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932377 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="northd" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932386 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932399 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="nbdb" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932409 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="sbdb" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932422 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovn-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932434 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="kube-rbac-proxy-ovn-metrics" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.932444 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovn-acl-logging" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.940992 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.941686 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" containerName="ovnkube-controller" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.948489 4765 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "71cc276b-f25c-460b-b718-f058cc1d2521" (UID: "71cc276b-f25c-460b-b718-f058cc1d2521"). InnerVolumeSpecName "run-systemd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:35:17 crc kubenswrapper[4765]: I0319 10:35:17.949249 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.016534 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-run-netns\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.016582 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-cni-netd\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.016601 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-env-overrides\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.016663 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-run-ovn\") pod 
\"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.016690 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-ovn-node-metrics-cert\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.016714 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.016732 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-ovnkube-config\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.016749 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-run-systemd\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.016773 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-node-log\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.016794 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-cni-bin\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.016812 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-run-openvswitch\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017001 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-run-ovn-kubernetes\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017020 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-var-lib-openvswitch\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017038 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-kubelet\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017053 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-log-socket\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017068 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-ovnkube-script-lib\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017084 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n49mb\" (UniqueName: \"kubernetes.io/projected/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-kube-api-access-n49mb\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017100 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-etc-openvswitch\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017253 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-systemd-units\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017349 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-slash\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017508 4765 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017533 4765 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017547 4765 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017566 4765 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017580 4765 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017593 4765 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017605 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q2sdr\" (UniqueName: \"kubernetes.io/projected/71cc276b-f25c-460b-b718-f058cc1d2521-kube-api-access-q2sdr\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017617 4765 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017631 4765 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/71cc276b-f25c-460b-b718-f058cc1d2521-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017643 4765 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/71cc276b-f25c-460b-b718-f058cc1d2521-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.017657 4765 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/71cc276b-f25c-460b-b718-f058cc1d2521-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118228 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-run-ovn-kubernetes\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118286 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-var-lib-openvswitch\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118405 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-run-ovn-kubernetes\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118411 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-kubelet\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118468 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-var-lib-openvswitch\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118442 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-kubelet\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118468 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-log-socket\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118546 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-ovnkube-script-lib\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118575 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n49mb\" (UniqueName: \"kubernetes.io/projected/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-kube-api-access-n49mb\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118594 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-log-socket\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118597 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-etc-openvswitch\") 
pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118622 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-etc-openvswitch\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118634 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-systemd-units\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118664 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-slash\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118694 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-run-netns\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118716 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-cni-netd\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118797 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-env-overrides\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118822 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-run-ovn\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118847 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-ovn-node-metrics-cert\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118875 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118899 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-ovnkube-config\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118926 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-run-systemd\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.118981 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-node-log\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.119018 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-cni-bin\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.119044 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-run-openvswitch\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.119120 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-run-openvswitch\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 
10:35:18.119153 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-systemd-units\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.119180 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-slash\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.119208 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-run-netns\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.119237 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-cni-netd\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.119648 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-env-overrides\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.119695 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: 
\"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-run-systemd\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.119766 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-node-log\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.119856 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.119877 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-run-ovn\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.119941 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-host-cni-bin\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.120105 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-ovnkube-config\") pod 
\"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.120385 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-ovnkube-script-lib\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.124211 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-ovn-node-metrics-cert\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.135782 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n49mb\" (UniqueName: \"kubernetes.io/projected/354d464a-a4d6-4f3e-83eb-ca5b2efd3f64-kube-api-access-n49mb\") pod \"ovnkube-node-xdfxh\" (UID: \"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64\") " pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.276733 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.318045 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-w862m" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.740057 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovnkube-controller/3.log" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.743622 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovn-acl-logging/0.log" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.744205 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kvv2h_71cc276b-f25c-460b-b718-f058cc1d2521/ovn-controller/0.log" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.744640 4765 generic.go:334] "Generic (PLEG): container finished" podID="71cc276b-f25c-460b-b718-f058cc1d2521" containerID="ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a" exitCode=0 Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.744690 4765 generic.go:334] "Generic (PLEG): container finished" podID="71cc276b-f25c-460b-b718-f058cc1d2521" containerID="f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a" exitCode=0 Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.744704 4765 generic.go:334] "Generic (PLEG): container finished" podID="71cc276b-f25c-460b-b718-f058cc1d2521" containerID="7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91" exitCode=0 Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.744714 4765 generic.go:334] "Generic (PLEG): container finished" podID="71cc276b-f25c-460b-b718-f058cc1d2521" containerID="778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2" exitCode=0 Mar 19 
10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.744733 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerDied","Data":"ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a"} Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.744756 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.744803 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerDied","Data":"f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a"} Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.744839 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerDied","Data":"7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91"} Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.744853 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerDied","Data":"778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2"} Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.744863 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kvv2h" event={"ID":"71cc276b-f25c-460b-b718-f058cc1d2521","Type":"ContainerDied","Data":"7066a410bb56ff63006cf9b29ff7e8650033cc93539512bd69fa128f48deca6d"} Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.744899 4765 scope.go:117] "RemoveContainer" containerID="ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 
10:35:18.746998 4765 generic.go:334] "Generic (PLEG): container finished" podID="354d464a-a4d6-4f3e-83eb-ca5b2efd3f64" containerID="074e3a3fb0197b6655278d3353ee71bda971e28072bd25c387ab44624287d157" exitCode=0 Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.747067 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" event={"ID":"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64","Type":"ContainerDied","Data":"074e3a3fb0197b6655278d3353ee71bda971e28072bd25c387ab44624287d157"} Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.747101 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" event={"ID":"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64","Type":"ContainerStarted","Data":"3a49a48ef54d4578763cc098506a4fcff41b6e9b3a78901f393a0d98273d642a"} Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.749625 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mmrh7_d9d027fd-4e70-4daf-9dd2-adefcc2a868f/kube-multus/2.log" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.761266 4765 scope.go:117] "RemoveContainer" containerID="fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.782846 4765 scope.go:117] "RemoveContainer" containerID="f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.805134 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kvv2h"] Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.810164 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-kvv2h"] Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.816769 4765 scope.go:117] "RemoveContainer" containerID="7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.829167 
4765 scope.go:117] "RemoveContainer" containerID="778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.859721 4765 scope.go:117] "RemoveContainer" containerID="5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.875308 4765 scope.go:117] "RemoveContainer" containerID="8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.889925 4765 scope.go:117] "RemoveContainer" containerID="071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.903802 4765 scope.go:117] "RemoveContainer" containerID="ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.932728 4765 scope.go:117] "RemoveContainer" containerID="ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.960049 4765 scope.go:117] "RemoveContainer" containerID="ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a" Mar 19 10:35:18 crc kubenswrapper[4765]: E0319 10:35:18.960583 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a\": container with ID starting with ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a not found: ID does not exist" containerID="ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.960620 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a"} err="failed to get container status \"ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a\": rpc 
error: code = NotFound desc = could not find container \"ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a\": container with ID starting with ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.960649 4765 scope.go:117] "RemoveContainer" containerID="fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9" Mar 19 10:35:18 crc kubenswrapper[4765]: E0319 10:35:18.961269 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9\": container with ID starting with fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9 not found: ID does not exist" containerID="fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.961328 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9"} err="failed to get container status \"fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9\": rpc error: code = NotFound desc = could not find container \"fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9\": container with ID starting with fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.961366 4765 scope.go:117] "RemoveContainer" containerID="f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a" Mar 19 10:35:18 crc kubenswrapper[4765]: E0319 10:35:18.961827 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\": container with ID starting with 
f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a not found: ID does not exist" containerID="f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.961882 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a"} err="failed to get container status \"f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\": rpc error: code = NotFound desc = could not find container \"f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\": container with ID starting with f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.961927 4765 scope.go:117] "RemoveContainer" containerID="7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91" Mar 19 10:35:18 crc kubenswrapper[4765]: E0319 10:35:18.962347 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\": container with ID starting with 7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91 not found: ID does not exist" containerID="7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.962387 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91"} err="failed to get container status \"7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\": rpc error: code = NotFound desc = could not find container \"7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\": container with ID starting with 7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91 not found: ID does not 
exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.962411 4765 scope.go:117] "RemoveContainer" containerID="778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2" Mar 19 10:35:18 crc kubenswrapper[4765]: E0319 10:35:18.962676 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\": container with ID starting with 778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2 not found: ID does not exist" containerID="778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.962705 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2"} err="failed to get container status \"778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\": rpc error: code = NotFound desc = could not find container \"778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\": container with ID starting with 778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.962722 4765 scope.go:117] "RemoveContainer" containerID="5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0" Mar 19 10:35:18 crc kubenswrapper[4765]: E0319 10:35:18.963183 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\": container with ID starting with 5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0 not found: ID does not exist" containerID="5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.963204 4765 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0"} err="failed to get container status \"5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\": rpc error: code = NotFound desc = could not find container \"5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\": container with ID starting with 5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.963217 4765 scope.go:117] "RemoveContainer" containerID="8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367" Mar 19 10:35:18 crc kubenswrapper[4765]: E0319 10:35:18.963499 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\": container with ID starting with 8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367 not found: ID does not exist" containerID="8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.963533 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367"} err="failed to get container status \"8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\": rpc error: code = NotFound desc = could not find container \"8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\": container with ID starting with 8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.963554 4765 scope.go:117] "RemoveContainer" containerID="071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88" Mar 19 10:35:18 crc kubenswrapper[4765]: E0319 10:35:18.963916 4765 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\": container with ID starting with 071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88 not found: ID does not exist" containerID="071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.963943 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88"} err="failed to get container status \"071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\": rpc error: code = NotFound desc = could not find container \"071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\": container with ID starting with 071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.963975 4765 scope.go:117] "RemoveContainer" containerID="ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2" Mar 19 10:35:18 crc kubenswrapper[4765]: E0319 10:35:18.964267 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\": container with ID starting with ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2 not found: ID does not exist" containerID="ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.964306 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2"} err="failed to get container status \"ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\": rpc error: code = NotFound desc = could 
not find container \"ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\": container with ID starting with ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.964324 4765 scope.go:117] "RemoveContainer" containerID="ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7" Mar 19 10:35:18 crc kubenswrapper[4765]: E0319 10:35:18.964601 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\": container with ID starting with ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7 not found: ID does not exist" containerID="ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.964635 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7"} err="failed to get container status \"ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\": rpc error: code = NotFound desc = could not find container \"ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\": container with ID starting with ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.964653 4765 scope.go:117] "RemoveContainer" containerID="ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.964936 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a"} err="failed to get container status \"ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a\": rpc error: code = NotFound 
desc = could not find container \"ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a\": container with ID starting with ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.964984 4765 scope.go:117] "RemoveContainer" containerID="fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.965291 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9"} err="failed to get container status \"fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9\": rpc error: code = NotFound desc = could not find container \"fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9\": container with ID starting with fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.965315 4765 scope.go:117] "RemoveContainer" containerID="f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.965621 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a"} err="failed to get container status \"f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\": rpc error: code = NotFound desc = could not find container \"f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\": container with ID starting with f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.965646 4765 scope.go:117] "RemoveContainer" containerID="7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 
10:35:18.965882 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91"} err="failed to get container status \"7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\": rpc error: code = NotFound desc = could not find container \"7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\": container with ID starting with 7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.965902 4765 scope.go:117] "RemoveContainer" containerID="778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.966170 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2"} err="failed to get container status \"778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\": rpc error: code = NotFound desc = could not find container \"778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\": container with ID starting with 778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.966198 4765 scope.go:117] "RemoveContainer" containerID="5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.966519 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0"} err="failed to get container status \"5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\": rpc error: code = NotFound desc = could not find container \"5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\": container with ID starting with 
5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.966547 4765 scope.go:117] "RemoveContainer" containerID="8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.966804 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367"} err="failed to get container status \"8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\": rpc error: code = NotFound desc = could not find container \"8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\": container with ID starting with 8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.966829 4765 scope.go:117] "RemoveContainer" containerID="071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.967158 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88"} err="failed to get container status \"071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\": rpc error: code = NotFound desc = could not find container \"071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\": container with ID starting with 071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.967180 4765 scope.go:117] "RemoveContainer" containerID="ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.967557 4765 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2"} err="failed to get container status \"ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\": rpc error: code = NotFound desc = could not find container \"ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\": container with ID starting with ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.967583 4765 scope.go:117] "RemoveContainer" containerID="ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.967839 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7"} err="failed to get container status \"ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\": rpc error: code = NotFound desc = could not find container \"ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\": container with ID starting with ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.967861 4765 scope.go:117] "RemoveContainer" containerID="ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.968113 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a"} err="failed to get container status \"ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a\": rpc error: code = NotFound desc = could not find container \"ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a\": container with ID starting with ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a not found: ID does not 
exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.968135 4765 scope.go:117] "RemoveContainer" containerID="fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.968353 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9"} err="failed to get container status \"fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9\": rpc error: code = NotFound desc = could not find container \"fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9\": container with ID starting with fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.968380 4765 scope.go:117] "RemoveContainer" containerID="f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.968596 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a"} err="failed to get container status \"f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\": rpc error: code = NotFound desc = could not find container \"f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\": container with ID starting with f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.968619 4765 scope.go:117] "RemoveContainer" containerID="7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.968829 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91"} err="failed to get container status 
\"7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\": rpc error: code = NotFound desc = could not find container \"7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\": container with ID starting with 7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.968848 4765 scope.go:117] "RemoveContainer" containerID="778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.969063 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2"} err="failed to get container status \"778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\": rpc error: code = NotFound desc = could not find container \"778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\": container with ID starting with 778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.969084 4765 scope.go:117] "RemoveContainer" containerID="5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.969702 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0"} err="failed to get container status \"5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\": rpc error: code = NotFound desc = could not find container \"5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\": container with ID starting with 5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.969722 4765 scope.go:117] "RemoveContainer" 
containerID="8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.970093 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367"} err="failed to get container status \"8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\": rpc error: code = NotFound desc = could not find container \"8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\": container with ID starting with 8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.970119 4765 scope.go:117] "RemoveContainer" containerID="071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.970528 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88"} err="failed to get container status \"071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\": rpc error: code = NotFound desc = could not find container \"071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\": container with ID starting with 071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.970553 4765 scope.go:117] "RemoveContainer" containerID="ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.973116 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2"} err="failed to get container status \"ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\": rpc error: code = NotFound desc = could 
not find container \"ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\": container with ID starting with ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.973147 4765 scope.go:117] "RemoveContainer" containerID="ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.973465 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7"} err="failed to get container status \"ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\": rpc error: code = NotFound desc = could not find container \"ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\": container with ID starting with ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.973561 4765 scope.go:117] "RemoveContainer" containerID="ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.973922 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a"} err="failed to get container status \"ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a\": rpc error: code = NotFound desc = could not find container \"ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a\": container with ID starting with ad0c4725c0c0febca77d2667b7d049629e519cc7e1bdd27f38983970da7a1d3a not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.973984 4765 scope.go:117] "RemoveContainer" containerID="fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 
10:35:18.974250 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9"} err="failed to get container status \"fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9\": rpc error: code = NotFound desc = could not find container \"fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9\": container with ID starting with fd2cd65a4c6b2ee83d5465478e129c0ef1316ac68f8de6e33aed996e38fd75d9 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.974277 4765 scope.go:117] "RemoveContainer" containerID="f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.974490 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a"} err="failed to get container status \"f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\": rpc error: code = NotFound desc = could not find container \"f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a\": container with ID starting with f3cfcb267511e94a2f85e51f2d70501158fb86cf555191a1567ddbd5eefde01a not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.974515 4765 scope.go:117] "RemoveContainer" containerID="7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.974817 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91"} err="failed to get container status \"7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\": rpc error: code = NotFound desc = could not find container \"7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91\": container with ID starting with 
7e02dad7ebd37050944ebb4c2037ffa11a6cbaabc323dab8637dca1c7adb7e91 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.974837 4765 scope.go:117] "RemoveContainer" containerID="778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.975164 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2"} err="failed to get container status \"778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\": rpc error: code = NotFound desc = could not find container \"778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2\": container with ID starting with 778ee636c60c31e250ed9eff09d46393059ce9a83be7a081846d0b0a045631b2 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.975182 4765 scope.go:117] "RemoveContainer" containerID="5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.975619 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0"} err="failed to get container status \"5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\": rpc error: code = NotFound desc = could not find container \"5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0\": container with ID starting with 5613f9dd44a567bdbe1eda55d93e494b39bfece94fcbe64efe9b9a40732c5af0 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.975638 4765 scope.go:117] "RemoveContainer" containerID="8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.975859 4765 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367"} err="failed to get container status \"8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\": rpc error: code = NotFound desc = could not find container \"8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367\": container with ID starting with 8f504f0d2fba68eaed4c1b0d8d6782dc82a3111a7acc404ceca9feb1e504a367 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.975879 4765 scope.go:117] "RemoveContainer" containerID="071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.976124 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88"} err="failed to get container status \"071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\": rpc error: code = NotFound desc = could not find container \"071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88\": container with ID starting with 071194e4b4b8b247daa3d635281ced4152e74e5ac4caa45f2a3a351ec3dcde88 not found: ID does not exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.976143 4765 scope.go:117] "RemoveContainer" containerID="ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.976301 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2"} err="failed to get container status \"ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\": rpc error: code = NotFound desc = could not find container \"ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2\": container with ID starting with ac78047429857acbb34cf4b0a039cf38c741cbe82d91f67a1a62c5769732f6a2 not found: ID does not 
exist" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.976325 4765 scope.go:117] "RemoveContainer" containerID="ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7" Mar 19 10:35:18 crc kubenswrapper[4765]: I0319 10:35:18.976479 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7"} err="failed to get container status \"ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\": rpc error: code = NotFound desc = could not find container \"ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7\": container with ID starting with ca528eab846e07e965bf6727b0af373c5aa49460f9abf0802e5e0f7876cb21d7 not found: ID does not exist" Mar 19 10:35:19 crc kubenswrapper[4765]: I0319 10:35:19.761109 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" event={"ID":"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64","Type":"ContainerStarted","Data":"c5058d75f5b928c7f09eab43d3b6f333beee8725f974f0f133c86dd3b5ec237a"} Mar 19 10:35:19 crc kubenswrapper[4765]: I0319 10:35:19.761510 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" event={"ID":"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64","Type":"ContainerStarted","Data":"00f243378321f00282acfd4edd95baa5500694faa814df8e16b59c9933c9dd0b"} Mar 19 10:35:19 crc kubenswrapper[4765]: I0319 10:35:19.761523 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" event={"ID":"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64","Type":"ContainerStarted","Data":"e8f4cdf95306ed1f484ac58ce1af47d29521e604c3ac70834de25f95ef1b620a"} Mar 19 10:35:19 crc kubenswrapper[4765]: I0319 10:35:19.761532 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" 
event={"ID":"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64","Type":"ContainerStarted","Data":"45375b738523a632892715370bcb5d69c961e4220fec5a6243e5dd8275c1201f"} Mar 19 10:35:19 crc kubenswrapper[4765]: I0319 10:35:19.761542 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" event={"ID":"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64","Type":"ContainerStarted","Data":"1133bb2aef58cb409aa231f185e01067081cdc0c0e45173a1711f8086ccd2454"} Mar 19 10:35:19 crc kubenswrapper[4765]: I0319 10:35:19.761554 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" event={"ID":"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64","Type":"ContainerStarted","Data":"66be4cb1bf132b1457794f2772f9bec922b5cee42507ae9f18aa4761576def0f"} Mar 19 10:35:20 crc kubenswrapper[4765]: I0319 10:35:20.369272 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71cc276b-f25c-460b-b718-f058cc1d2521" path="/var/lib/kubelet/pods/71cc276b-f25c-460b-b718-f058cc1d2521/volumes" Mar 19 10:35:21 crc kubenswrapper[4765]: I0319 10:35:21.785695 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" event={"ID":"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64","Type":"ContainerStarted","Data":"79f75bd8ece7bb7b51afc7dbe4f56a2be920fb86952856e5dfb1ca5ebf3382f0"} Mar 19 10:35:24 crc kubenswrapper[4765]: I0319 10:35:24.805976 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" event={"ID":"354d464a-a4d6-4f3e-83eb-ca5b2efd3f64","Type":"ContainerStarted","Data":"6862ec073979d57b9ce36a94840dc215545736463c5eac6e2adce756e09d74f9"} Mar 19 10:35:24 crc kubenswrapper[4765]: I0319 10:35:24.807107 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:24 crc kubenswrapper[4765]: I0319 10:35:24.807122 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:24 crc kubenswrapper[4765]: I0319 10:35:24.839066 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" podStartSLOduration=7.83904263 podStartE2EDuration="7.83904263s" podCreationTimestamp="2026-03-19 10:35:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:35:24.838173566 +0000 UTC m=+823.187119138" watchObservedRunningTime="2026-03-19 10:35:24.83904263 +0000 UTC m=+823.187988172" Mar 19 10:35:24 crc kubenswrapper[4765]: I0319 10:35:24.844372 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:25 crc kubenswrapper[4765]: I0319 10:35:25.813059 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:25 crc kubenswrapper[4765]: I0319 10:35:25.849911 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:31 crc kubenswrapper[4765]: I0319 10:35:31.356556 4765 scope.go:117] "RemoveContainer" containerID="e5956a936881882c7602c0fc752793d8c139ba7cbb4a3a5cd015552febec3d5b" Mar 19 10:35:31 crc kubenswrapper[4765]: E0319 10:35:31.357401 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-mmrh7_openshift-multus(d9d027fd-4e70-4daf-9dd2-adefcc2a868f)\"" pod="openshift-multus/multus-mmrh7" podUID="d9d027fd-4e70-4daf-9dd2-adefcc2a868f" Mar 19 10:35:42 crc kubenswrapper[4765]: I0319 10:35:42.804272 4765 scope.go:117] "RemoveContainer" containerID="14a43eed4dfe0f9e563278b73ce4359443e0ce2531c928ab9e6ade729c8a7be6" Mar 19 10:35:42 crc kubenswrapper[4765]: I0319 
10:35:42.846072 4765 scope.go:117] "RemoveContainer" containerID="91fe485b8c9e70039e92cd64618ac48ec8c86780ab6016d0913dda548ac314fc" Mar 19 10:35:43 crc kubenswrapper[4765]: I0319 10:35:43.356847 4765 scope.go:117] "RemoveContainer" containerID="e5956a936881882c7602c0fc752793d8c139ba7cbb4a3a5cd015552febec3d5b" Mar 19 10:35:43 crc kubenswrapper[4765]: I0319 10:35:43.951692 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-mmrh7_d9d027fd-4e70-4daf-9dd2-adefcc2a868f/kube-multus/2.log" Mar 19 10:35:43 crc kubenswrapper[4765]: I0319 10:35:43.951801 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-mmrh7" event={"ID":"d9d027fd-4e70-4daf-9dd2-adefcc2a868f","Type":"ContainerStarted","Data":"b669d11dd35cd2122689f5bf6ceeeca62944aef19be73d1244f6edb8ef31b63d"} Mar 19 10:35:48 crc kubenswrapper[4765]: I0319 10:35:48.309059 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xdfxh" Mar 19 10:35:59 crc kubenswrapper[4765]: I0319 10:35:59.530926 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z"] Mar 19 10:35:59 crc kubenswrapper[4765]: I0319 10:35:59.533028 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" Mar 19 10:35:59 crc kubenswrapper[4765]: I0319 10:35:59.534792 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 10:35:59 crc kubenswrapper[4765]: I0319 10:35:59.538833 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z"] Mar 19 10:35:59 crc kubenswrapper[4765]: I0319 10:35:59.549975 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xpjr\" (UniqueName: \"kubernetes.io/projected/2bd39f53-5aca-44bd-93ed-bff9ffafb381-kube-api-access-8xpjr\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z\" (UID: \"2bd39f53-5aca-44bd-93ed-bff9ffafb381\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" Mar 19 10:35:59 crc kubenswrapper[4765]: I0319 10:35:59.550036 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2bd39f53-5aca-44bd-93ed-bff9ffafb381-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z\" (UID: \"2bd39f53-5aca-44bd-93ed-bff9ffafb381\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" Mar 19 10:35:59 crc kubenswrapper[4765]: I0319 10:35:59.550090 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2bd39f53-5aca-44bd-93ed-bff9ffafb381-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z\" (UID: \"2bd39f53-5aca-44bd-93ed-bff9ffafb381\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" Mar 19 10:35:59 crc kubenswrapper[4765]: 
I0319 10:35:59.651161 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2bd39f53-5aca-44bd-93ed-bff9ffafb381-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z\" (UID: \"2bd39f53-5aca-44bd-93ed-bff9ffafb381\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" Mar 19 10:35:59 crc kubenswrapper[4765]: I0319 10:35:59.651247 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2bd39f53-5aca-44bd-93ed-bff9ffafb381-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z\" (UID: \"2bd39f53-5aca-44bd-93ed-bff9ffafb381\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" Mar 19 10:35:59 crc kubenswrapper[4765]: I0319 10:35:59.651283 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xpjr\" (UniqueName: \"kubernetes.io/projected/2bd39f53-5aca-44bd-93ed-bff9ffafb381-kube-api-access-8xpjr\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z\" (UID: \"2bd39f53-5aca-44bd-93ed-bff9ffafb381\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" Mar 19 10:35:59 crc kubenswrapper[4765]: I0319 10:35:59.651726 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2bd39f53-5aca-44bd-93ed-bff9ffafb381-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z\" (UID: \"2bd39f53-5aca-44bd-93ed-bff9ffafb381\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" Mar 19 10:35:59 crc kubenswrapper[4765]: I0319 10:35:59.651755 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/2bd39f53-5aca-44bd-93ed-bff9ffafb381-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z\" (UID: \"2bd39f53-5aca-44bd-93ed-bff9ffafb381\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" Mar 19 10:35:59 crc kubenswrapper[4765]: I0319 10:35:59.670542 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xpjr\" (UniqueName: \"kubernetes.io/projected/2bd39f53-5aca-44bd-93ed-bff9ffafb381-kube-api-access-8xpjr\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z\" (UID: \"2bd39f53-5aca-44bd-93ed-bff9ffafb381\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" Mar 19 10:35:59 crc kubenswrapper[4765]: I0319 10:35:59.848044 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" Mar 19 10:36:00 crc kubenswrapper[4765]: I0319 10:36:00.100073 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z"] Mar 19 10:36:00 crc kubenswrapper[4765]: I0319 10:36:00.138838 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565276-tfhl8"] Mar 19 10:36:00 crc kubenswrapper[4765]: I0319 10:36:00.140637 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565276-tfhl8" Mar 19 10:36:00 crc kubenswrapper[4765]: I0319 10:36:00.143699 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:36:00 crc kubenswrapper[4765]: I0319 10:36:00.144035 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:36:00 crc kubenswrapper[4765]: I0319 10:36:00.144262 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:36:00 crc kubenswrapper[4765]: I0319 10:36:00.147033 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565276-tfhl8"] Mar 19 10:36:00 crc kubenswrapper[4765]: I0319 10:36:00.158574 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-snmtw\" (UniqueName: \"kubernetes.io/projected/0d2bf5f7-9350-4366-8930-8a9383045a69-kube-api-access-snmtw\") pod \"auto-csr-approver-29565276-tfhl8\" (UID: \"0d2bf5f7-9350-4366-8930-8a9383045a69\") " pod="openshift-infra/auto-csr-approver-29565276-tfhl8" Mar 19 10:36:00 crc kubenswrapper[4765]: I0319 10:36:00.259667 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-snmtw\" (UniqueName: \"kubernetes.io/projected/0d2bf5f7-9350-4366-8930-8a9383045a69-kube-api-access-snmtw\") pod \"auto-csr-approver-29565276-tfhl8\" (UID: \"0d2bf5f7-9350-4366-8930-8a9383045a69\") " pod="openshift-infra/auto-csr-approver-29565276-tfhl8" Mar 19 10:36:00 crc kubenswrapper[4765]: I0319 10:36:00.278326 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-snmtw\" (UniqueName: \"kubernetes.io/projected/0d2bf5f7-9350-4366-8930-8a9383045a69-kube-api-access-snmtw\") pod \"auto-csr-approver-29565276-tfhl8\" (UID: \"0d2bf5f7-9350-4366-8930-8a9383045a69\") " 
pod="openshift-infra/auto-csr-approver-29565276-tfhl8" Mar 19 10:36:00 crc kubenswrapper[4765]: I0319 10:36:00.496617 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565276-tfhl8" Mar 19 10:36:00 crc kubenswrapper[4765]: I0319 10:36:00.676241 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565276-tfhl8"] Mar 19 10:36:00 crc kubenswrapper[4765]: W0319 10:36:00.690316 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d2bf5f7_9350_4366_8930_8a9383045a69.slice/crio-3644cd096b882bdbf088b737278abc097ef1031cf515777c2337c08634c0f28c WatchSource:0}: Error finding container 3644cd096b882bdbf088b737278abc097ef1031cf515777c2337c08634c0f28c: Status 404 returned error can't find the container with id 3644cd096b882bdbf088b737278abc097ef1031cf515777c2337c08634c0f28c Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.073793 4765 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.080025 4765 generic.go:334] "Generic (PLEG): container finished" podID="2bd39f53-5aca-44bd-93ed-bff9ffafb381" containerID="3117238f832d3e3b859d594de6215f5048ecded12480153f7d284d06b6266a7e" exitCode=0 Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.080088 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" event={"ID":"2bd39f53-5aca-44bd-93ed-bff9ffafb381","Type":"ContainerDied","Data":"3117238f832d3e3b859d594de6215f5048ecded12480153f7d284d06b6266a7e"} Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.080174 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" 
event={"ID":"2bd39f53-5aca-44bd-93ed-bff9ffafb381","Type":"ContainerStarted","Data":"cb2b67bc1a90b3db3091a55ef6efca0a57fffa215ec718d96b951d7f839293c8"} Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.081059 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565276-tfhl8" event={"ID":"0d2bf5f7-9350-4366-8930-8a9383045a69","Type":"ContainerStarted","Data":"3644cd096b882bdbf088b737278abc097ef1031cf515777c2337c08634c0f28c"} Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.635183 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-brkb2"] Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.636686 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.646397 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-brkb2"] Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.660321 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.661167 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.680749 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzlvc\" (UniqueName: 
\"kubernetes.io/projected/c386e3c2-2465-4d85-a424-2ff9b5489178-kube-api-access-qzlvc\") pod \"redhat-operators-brkb2\" (UID: \"c386e3c2-2465-4d85-a424-2ff9b5489178\") " pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.681141 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c386e3c2-2465-4d85-a424-2ff9b5489178-catalog-content\") pod \"redhat-operators-brkb2\" (UID: \"c386e3c2-2465-4d85-a424-2ff9b5489178\") " pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.681199 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c386e3c2-2465-4d85-a424-2ff9b5489178-utilities\") pod \"redhat-operators-brkb2\" (UID: \"c386e3c2-2465-4d85-a424-2ff9b5489178\") " pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.782193 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzlvc\" (UniqueName: \"kubernetes.io/projected/c386e3c2-2465-4d85-a424-2ff9b5489178-kube-api-access-qzlvc\") pod \"redhat-operators-brkb2\" (UID: \"c386e3c2-2465-4d85-a424-2ff9b5489178\") " pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.782247 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c386e3c2-2465-4d85-a424-2ff9b5489178-catalog-content\") pod \"redhat-operators-brkb2\" (UID: \"c386e3c2-2465-4d85-a424-2ff9b5489178\") " pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.782287 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/c386e3c2-2465-4d85-a424-2ff9b5489178-utilities\") pod \"redhat-operators-brkb2\" (UID: \"c386e3c2-2465-4d85-a424-2ff9b5489178\") " pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.782783 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c386e3c2-2465-4d85-a424-2ff9b5489178-utilities\") pod \"redhat-operators-brkb2\" (UID: \"c386e3c2-2465-4d85-a424-2ff9b5489178\") " pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.783122 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c386e3c2-2465-4d85-a424-2ff9b5489178-catalog-content\") pod \"redhat-operators-brkb2\" (UID: \"c386e3c2-2465-4d85-a424-2ff9b5489178\") " pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.809685 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzlvc\" (UniqueName: \"kubernetes.io/projected/c386e3c2-2465-4d85-a424-2ff9b5489178-kube-api-access-qzlvc\") pod \"redhat-operators-brkb2\" (UID: \"c386e3c2-2465-4d85-a424-2ff9b5489178\") " pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:01 crc kubenswrapper[4765]: I0319 10:36:01.994776 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:02 crc kubenswrapper[4765]: I0319 10:36:02.094327 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565276-tfhl8" event={"ID":"0d2bf5f7-9350-4366-8930-8a9383045a69","Type":"ContainerStarted","Data":"2a25fb65b9413294faba5a046fe314e6432042e0f75e502bbf896a2043d09937"} Mar 19 10:36:02 crc kubenswrapper[4765]: I0319 10:36:02.148314 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565276-tfhl8" podStartSLOduration=1.067076614 podStartE2EDuration="2.14827651s" podCreationTimestamp="2026-03-19 10:36:00 +0000 UTC" firstStartedPulling="2026-03-19 10:36:00.693183107 +0000 UTC m=+859.042128649" lastFinishedPulling="2026-03-19 10:36:01.774383003 +0000 UTC m=+860.123328545" observedRunningTime="2026-03-19 10:36:02.140925727 +0000 UTC m=+860.489871279" watchObservedRunningTime="2026-03-19 10:36:02.14827651 +0000 UTC m=+860.497222052" Mar 19 10:36:02 crc kubenswrapper[4765]: I0319 10:36:02.275976 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-brkb2"] Mar 19 10:36:03 crc kubenswrapper[4765]: I0319 10:36:03.102510 4765 generic.go:334] "Generic (PLEG): container finished" podID="c386e3c2-2465-4d85-a424-2ff9b5489178" containerID="a3dc3904cfc5551a7d980d101230408be6e416badeff94b54794d45a8da9b852" exitCode=0 Mar 19 10:36:03 crc kubenswrapper[4765]: I0319 10:36:03.102744 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brkb2" event={"ID":"c386e3c2-2465-4d85-a424-2ff9b5489178","Type":"ContainerDied","Data":"a3dc3904cfc5551a7d980d101230408be6e416badeff94b54794d45a8da9b852"} Mar 19 10:36:03 crc kubenswrapper[4765]: I0319 10:36:03.102996 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brkb2" 
event={"ID":"c386e3c2-2465-4d85-a424-2ff9b5489178","Type":"ContainerStarted","Data":"5fe4b0b8267bbebffa70bc0fa3beaa6e94284ef77290a6417aea3c4b00fbe585"} Mar 19 10:36:03 crc kubenswrapper[4765]: I0319 10:36:03.105621 4765 generic.go:334] "Generic (PLEG): container finished" podID="2bd39f53-5aca-44bd-93ed-bff9ffafb381" containerID="98c1194704f941d2bad267d72e5b7b2aa375a811bcca2f8d9ec6be5f0b98aa3a" exitCode=0 Mar 19 10:36:03 crc kubenswrapper[4765]: I0319 10:36:03.105684 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" event={"ID":"2bd39f53-5aca-44bd-93ed-bff9ffafb381","Type":"ContainerDied","Data":"98c1194704f941d2bad267d72e5b7b2aa375a811bcca2f8d9ec6be5f0b98aa3a"} Mar 19 10:36:03 crc kubenswrapper[4765]: I0319 10:36:03.108875 4765 generic.go:334] "Generic (PLEG): container finished" podID="0d2bf5f7-9350-4366-8930-8a9383045a69" containerID="2a25fb65b9413294faba5a046fe314e6432042e0f75e502bbf896a2043d09937" exitCode=0 Mar 19 10:36:03 crc kubenswrapper[4765]: I0319 10:36:03.108906 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565276-tfhl8" event={"ID":"0d2bf5f7-9350-4366-8930-8a9383045a69","Type":"ContainerDied","Data":"2a25fb65b9413294faba5a046fe314e6432042e0f75e502bbf896a2043d09937"} Mar 19 10:36:04 crc kubenswrapper[4765]: I0319 10:36:04.122380 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" event={"ID":"2bd39f53-5aca-44bd-93ed-bff9ffafb381","Type":"ContainerDied","Data":"f140986e6be0ff89fa830e5dc1adcbe7e5c4545af780a994f6edb4e5dbb1e34e"} Mar 19 10:36:04 crc kubenswrapper[4765]: I0319 10:36:04.122784 4765 generic.go:334] "Generic (PLEG): container finished" podID="2bd39f53-5aca-44bd-93ed-bff9ffafb381" containerID="f140986e6be0ff89fa830e5dc1adcbe7e5c4545af780a994f6edb4e5dbb1e34e" exitCode=0 Mar 19 10:36:04 crc 
kubenswrapper[4765]: I0319 10:36:04.472764 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565276-tfhl8" Mar 19 10:36:04 crc kubenswrapper[4765]: I0319 10:36:04.625695 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-snmtw\" (UniqueName: \"kubernetes.io/projected/0d2bf5f7-9350-4366-8930-8a9383045a69-kube-api-access-snmtw\") pod \"0d2bf5f7-9350-4366-8930-8a9383045a69\" (UID: \"0d2bf5f7-9350-4366-8930-8a9383045a69\") " Mar 19 10:36:04 crc kubenswrapper[4765]: I0319 10:36:04.632402 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d2bf5f7-9350-4366-8930-8a9383045a69-kube-api-access-snmtw" (OuterVolumeSpecName: "kube-api-access-snmtw") pod "0d2bf5f7-9350-4366-8930-8a9383045a69" (UID: "0d2bf5f7-9350-4366-8930-8a9383045a69"). InnerVolumeSpecName "kube-api-access-snmtw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:36:04 crc kubenswrapper[4765]: I0319 10:36:04.727621 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-snmtw\" (UniqueName: \"kubernetes.io/projected/0d2bf5f7-9350-4366-8930-8a9383045a69-kube-api-access-snmtw\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.132628 4765 generic.go:334] "Generic (PLEG): container finished" podID="c386e3c2-2465-4d85-a424-2ff9b5489178" containerID="dd9136f96f7101de9ed7059235cb6b53445f84f69c95d4624173a355f3444d96" exitCode=0 Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.132745 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brkb2" event={"ID":"c386e3c2-2465-4d85-a424-2ff9b5489178","Type":"ContainerDied","Data":"dd9136f96f7101de9ed7059235cb6b53445f84f69c95d4624173a355f3444d96"} Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.134799 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-infra/auto-csr-approver-29565276-tfhl8" event={"ID":"0d2bf5f7-9350-4366-8930-8a9383045a69","Type":"ContainerDied","Data":"3644cd096b882bdbf088b737278abc097ef1031cf515777c2337c08634c0f28c"} Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.134852 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3644cd096b882bdbf088b737278abc097ef1031cf515777c2337c08634c0f28c" Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.134813 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565276-tfhl8" Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.214018 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565270-xhklx"] Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.219688 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565270-xhklx"] Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.400044 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.537625 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2bd39f53-5aca-44bd-93ed-bff9ffafb381-bundle\") pod \"2bd39f53-5aca-44bd-93ed-bff9ffafb381\" (UID: \"2bd39f53-5aca-44bd-93ed-bff9ffafb381\") " Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.537718 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8xpjr\" (UniqueName: \"kubernetes.io/projected/2bd39f53-5aca-44bd-93ed-bff9ffafb381-kube-api-access-8xpjr\") pod \"2bd39f53-5aca-44bd-93ed-bff9ffafb381\" (UID: \"2bd39f53-5aca-44bd-93ed-bff9ffafb381\") " Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.537826 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2bd39f53-5aca-44bd-93ed-bff9ffafb381-util\") pod \"2bd39f53-5aca-44bd-93ed-bff9ffafb381\" (UID: \"2bd39f53-5aca-44bd-93ed-bff9ffafb381\") " Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.538653 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bd39f53-5aca-44bd-93ed-bff9ffafb381-bundle" (OuterVolumeSpecName: "bundle") pod "2bd39f53-5aca-44bd-93ed-bff9ffafb381" (UID: "2bd39f53-5aca-44bd-93ed-bff9ffafb381"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.541727 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bd39f53-5aca-44bd-93ed-bff9ffafb381-kube-api-access-8xpjr" (OuterVolumeSpecName: "kube-api-access-8xpjr") pod "2bd39f53-5aca-44bd-93ed-bff9ffafb381" (UID: "2bd39f53-5aca-44bd-93ed-bff9ffafb381"). InnerVolumeSpecName "kube-api-access-8xpjr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.554748 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2bd39f53-5aca-44bd-93ed-bff9ffafb381-util" (OuterVolumeSpecName: "util") pod "2bd39f53-5aca-44bd-93ed-bff9ffafb381" (UID: "2bd39f53-5aca-44bd-93ed-bff9ffafb381"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.639867 4765 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/2bd39f53-5aca-44bd-93ed-bff9ffafb381-util\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.639913 4765 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/2bd39f53-5aca-44bd-93ed-bff9ffafb381-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:05 crc kubenswrapper[4765]: I0319 10:36:05.639925 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8xpjr\" (UniqueName: \"kubernetes.io/projected/2bd39f53-5aca-44bd-93ed-bff9ffafb381-kube-api-access-8xpjr\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:06 crc kubenswrapper[4765]: I0319 10:36:06.143555 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" event={"ID":"2bd39f53-5aca-44bd-93ed-bff9ffafb381","Type":"ContainerDied","Data":"cb2b67bc1a90b3db3091a55ef6efca0a57fffa215ec718d96b951d7f839293c8"} Mar 19 10:36:06 crc kubenswrapper[4765]: I0319 10:36:06.143825 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb2b67bc1a90b3db3091a55ef6efca0a57fffa215ec718d96b951d7f839293c8" Mar 19 10:36:06 crc kubenswrapper[4765]: I0319 10:36:06.143569 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z" Mar 19 10:36:06 crc kubenswrapper[4765]: I0319 10:36:06.146556 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brkb2" event={"ID":"c386e3c2-2465-4d85-a424-2ff9b5489178","Type":"ContainerStarted","Data":"a4144029f07348f46ad263eec6eca566d8befa63f1838707ecbbdc3288ab3d9c"} Mar 19 10:36:06 crc kubenswrapper[4765]: I0319 10:36:06.166223 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-brkb2" podStartSLOduration=2.724191964 podStartE2EDuration="5.166197256s" podCreationTimestamp="2026-03-19 10:36:01 +0000 UTC" firstStartedPulling="2026-03-19 10:36:03.105101097 +0000 UTC m=+861.454046639" lastFinishedPulling="2026-03-19 10:36:05.547106379 +0000 UTC m=+863.896051931" observedRunningTime="2026-03-19 10:36:06.165667622 +0000 UTC m=+864.514613164" watchObservedRunningTime="2026-03-19 10:36:06.166197256 +0000 UTC m=+864.515142808" Mar 19 10:36:06 crc kubenswrapper[4765]: I0319 10:36:06.363735 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="665575bc-6ce3-4177-9229-2fe41e45fced" path="/var/lib/kubelet/pods/665575bc-6ce3-4177-9229-2fe41e45fced/volumes" Mar 19 10:36:09 crc kubenswrapper[4765]: I0319 10:36:09.954638 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-9bsj5"] Mar 19 10:36:09 crc kubenswrapper[4765]: E0319 10:36:09.955324 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2bf5f7-9350-4366-8930-8a9383045a69" containerName="oc" Mar 19 10:36:09 crc kubenswrapper[4765]: I0319 10:36:09.955337 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2bf5f7-9350-4366-8930-8a9383045a69" containerName="oc" Mar 19 10:36:09 crc kubenswrapper[4765]: E0319 10:36:09.955351 4765 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="2bd39f53-5aca-44bd-93ed-bff9ffafb381" containerName="pull" Mar 19 10:36:09 crc kubenswrapper[4765]: I0319 10:36:09.955357 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd39f53-5aca-44bd-93ed-bff9ffafb381" containerName="pull" Mar 19 10:36:09 crc kubenswrapper[4765]: E0319 10:36:09.955367 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd39f53-5aca-44bd-93ed-bff9ffafb381" containerName="extract" Mar 19 10:36:09 crc kubenswrapper[4765]: I0319 10:36:09.955376 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd39f53-5aca-44bd-93ed-bff9ffafb381" containerName="extract" Mar 19 10:36:09 crc kubenswrapper[4765]: E0319 10:36:09.955388 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2bd39f53-5aca-44bd-93ed-bff9ffafb381" containerName="util" Mar 19 10:36:09 crc kubenswrapper[4765]: I0319 10:36:09.955394 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd39f53-5aca-44bd-93ed-bff9ffafb381" containerName="util" Mar 19 10:36:09 crc kubenswrapper[4765]: I0319 10:36:09.955502 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="2bd39f53-5aca-44bd-93ed-bff9ffafb381" containerName="extract" Mar 19 10:36:09 crc kubenswrapper[4765]: I0319 10:36:09.955511 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2bf5f7-9350-4366-8930-8a9383045a69" containerName="oc" Mar 19 10:36:09 crc kubenswrapper[4765]: I0319 10:36:09.955939 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-9bsj5" Mar 19 10:36:09 crc kubenswrapper[4765]: I0319 10:36:09.960302 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-rfzl9" Mar 19 10:36:09 crc kubenswrapper[4765]: I0319 10:36:09.961752 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 19 10:36:09 crc kubenswrapper[4765]: I0319 10:36:09.961915 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 19 10:36:09 crc kubenswrapper[4765]: I0319 10:36:09.973433 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-9bsj5"] Mar 19 10:36:10 crc kubenswrapper[4765]: I0319 10:36:10.096580 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cpsp\" (UniqueName: \"kubernetes.io/projected/75557987-e600-4f26-b66a-45a76da143cf-kube-api-access-5cpsp\") pod \"nmstate-operator-796d4cfff4-9bsj5\" (UID: \"75557987-e600-4f26-b66a-45a76da143cf\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-9bsj5" Mar 19 10:36:10 crc kubenswrapper[4765]: I0319 10:36:10.198459 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5cpsp\" (UniqueName: \"kubernetes.io/projected/75557987-e600-4f26-b66a-45a76da143cf-kube-api-access-5cpsp\") pod \"nmstate-operator-796d4cfff4-9bsj5\" (UID: \"75557987-e600-4f26-b66a-45a76da143cf\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-9bsj5" Mar 19 10:36:10 crc kubenswrapper[4765]: I0319 10:36:10.237287 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5cpsp\" (UniqueName: \"kubernetes.io/projected/75557987-e600-4f26-b66a-45a76da143cf-kube-api-access-5cpsp\") pod \"nmstate-operator-796d4cfff4-9bsj5\" (UID: 
\"75557987-e600-4f26-b66a-45a76da143cf\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-9bsj5" Mar 19 10:36:10 crc kubenswrapper[4765]: I0319 10:36:10.272882 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-9bsj5" Mar 19 10:36:10 crc kubenswrapper[4765]: I0319 10:36:10.521718 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-9bsj5"] Mar 19 10:36:10 crc kubenswrapper[4765]: W0319 10:36:10.524434 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75557987_e600_4f26_b66a_45a76da143cf.slice/crio-65ab3683e1dd5ccf3d78c535262e8faa8719253949b41724afabf053c05ae26c WatchSource:0}: Error finding container 65ab3683e1dd5ccf3d78c535262e8faa8719253949b41724afabf053c05ae26c: Status 404 returned error can't find the container with id 65ab3683e1dd5ccf3d78c535262e8faa8719253949b41724afabf053c05ae26c Mar 19 10:36:11 crc kubenswrapper[4765]: I0319 10:36:11.183562 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-9bsj5" event={"ID":"75557987-e600-4f26-b66a-45a76da143cf","Type":"ContainerStarted","Data":"65ab3683e1dd5ccf3d78c535262e8faa8719253949b41724afabf053c05ae26c"} Mar 19 10:36:11 crc kubenswrapper[4765]: I0319 10:36:11.995157 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:11 crc kubenswrapper[4765]: I0319 10:36:11.995620 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:13 crc kubenswrapper[4765]: I0319 10:36:13.039155 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-brkb2" podUID="c386e3c2-2465-4d85-a424-2ff9b5489178" containerName="registry-server" probeResult="failure" output=< Mar 
19 10:36:13 crc kubenswrapper[4765]: timeout: failed to connect service ":50051" within 1s Mar 19 10:36:13 crc kubenswrapper[4765]: > Mar 19 10:36:14 crc kubenswrapper[4765]: I0319 10:36:14.206996 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-9bsj5" event={"ID":"75557987-e600-4f26-b66a-45a76da143cf","Type":"ContainerStarted","Data":"52bb3824abc4f5fe872d50183184274f24fe5b31151a877ed3e81c074c7d53b6"} Mar 19 10:36:14 crc kubenswrapper[4765]: I0319 10:36:14.223385 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-9bsj5" podStartSLOduration=2.236866086 podStartE2EDuration="5.223365777s" podCreationTimestamp="2026-03-19 10:36:09 +0000 UTC" firstStartedPulling="2026-03-19 10:36:10.529641729 +0000 UTC m=+868.878587271" lastFinishedPulling="2026-03-19 10:36:13.51614142 +0000 UTC m=+871.865086962" observedRunningTime="2026-03-19 10:36:14.223131001 +0000 UTC m=+872.572076593" watchObservedRunningTime="2026-03-19 10:36:14.223365777 +0000 UTC m=+872.572311319" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.606426 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-z6jbc"] Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.607834 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z6jbc" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.615734 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-h2nxb" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.623773 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7"] Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.624829 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.629409 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-z6jbc"] Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.631915 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.657261 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7"] Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.668600 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-n2hqs"] Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.669461 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.732873 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stw9v\" (UniqueName: \"kubernetes.io/projected/ac4e199b-7261-41f0-b9e9-51b167be05a7-kube-api-access-stw9v\") pod \"nmstate-webhook-5f558f5558-rl6q7\" (UID: \"ac4e199b-7261-41f0-b9e9-51b167be05a7\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.733141 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6p2bg\" (UniqueName: \"kubernetes.io/projected/5a5acf7a-e38f-4ef1-9576-eae0d8e9a582-kube-api-access-6p2bg\") pod \"nmstate-metrics-9b8c8685d-z6jbc\" (UID: \"5a5acf7a-e38f-4ef1-9576-eae0d8e9a582\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z6jbc" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.733321 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ac4e199b-7261-41f0-b9e9-51b167be05a7-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rl6q7\" (UID: \"ac4e199b-7261-41f0-b9e9-51b167be05a7\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.786700 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf"] Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.787800 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.791436 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.791708 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.791985 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-brn8c" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.796254 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf"] Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.834738 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e3d573a8-f56f-45e5-9905-5810e82af6ac-dbus-socket\") pod \"nmstate-handler-n2hqs\" (UID: \"e3d573a8-f56f-45e5-9905-5810e82af6ac\") " pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.834797 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/76583484-f8aa-4a95-8450-206a93fb2b6c-nginx-conf\") pod 
\"nmstate-console-plugin-86f58fcf4-x2vbf\" (UID: \"76583484-f8aa-4a95-8450-206a93fb2b6c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.834820 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e3d573a8-f56f-45e5-9905-5810e82af6ac-nmstate-lock\") pod \"nmstate-handler-n2hqs\" (UID: \"e3d573a8-f56f-45e5-9905-5810e82af6ac\") " pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.834896 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ac4e199b-7261-41f0-b9e9-51b167be05a7-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rl6q7\" (UID: \"ac4e199b-7261-41f0-b9e9-51b167be05a7\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.834930 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stw9v\" (UniqueName: \"kubernetes.io/projected/ac4e199b-7261-41f0-b9e9-51b167be05a7-kube-api-access-stw9v\") pod \"nmstate-webhook-5f558f5558-rl6q7\" (UID: \"ac4e199b-7261-41f0-b9e9-51b167be05a7\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.834951 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/76583484-f8aa-4a95-8450-206a93fb2b6c-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-x2vbf\" (UID: \"76583484-f8aa-4a95-8450-206a93fb2b6c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.835001 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-9zbg9\" (UniqueName: \"kubernetes.io/projected/e3d573a8-f56f-45e5-9905-5810e82af6ac-kube-api-access-9zbg9\") pod \"nmstate-handler-n2hqs\" (UID: \"e3d573a8-f56f-45e5-9905-5810e82af6ac\") " pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.835048 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6p2bg\" (UniqueName: \"kubernetes.io/projected/5a5acf7a-e38f-4ef1-9576-eae0d8e9a582-kube-api-access-6p2bg\") pod \"nmstate-metrics-9b8c8685d-z6jbc\" (UID: \"5a5acf7a-e38f-4ef1-9576-eae0d8e9a582\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z6jbc" Mar 19 10:36:19 crc kubenswrapper[4765]: E0319 10:36:19.835050 4765 secret.go:188] Couldn't get secret openshift-nmstate/openshift-nmstate-webhook: secret "openshift-nmstate-webhook" not found Mar 19 10:36:19 crc kubenswrapper[4765]: E0319 10:36:19.835135 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ac4e199b-7261-41f0-b9e9-51b167be05a7-tls-key-pair podName:ac4e199b-7261-41f0-b9e9-51b167be05a7 nodeName:}" failed. No retries permitted until 2026-03-19 10:36:20.335110223 +0000 UTC m=+878.684055765 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-key-pair" (UniqueName: "kubernetes.io/secret/ac4e199b-7261-41f0-b9e9-51b167be05a7-tls-key-pair") pod "nmstate-webhook-5f558f5558-rl6q7" (UID: "ac4e199b-7261-41f0-b9e9-51b167be05a7") : secret "openshift-nmstate-webhook" not found Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.835068 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4c4j\" (UniqueName: \"kubernetes.io/projected/76583484-f8aa-4a95-8450-206a93fb2b6c-kube-api-access-h4c4j\") pod \"nmstate-console-plugin-86f58fcf4-x2vbf\" (UID: \"76583484-f8aa-4a95-8450-206a93fb2b6c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.835494 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e3d573a8-f56f-45e5-9905-5810e82af6ac-ovs-socket\") pod \"nmstate-handler-n2hqs\" (UID: \"e3d573a8-f56f-45e5-9905-5810e82af6ac\") " pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.855342 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stw9v\" (UniqueName: \"kubernetes.io/projected/ac4e199b-7261-41f0-b9e9-51b167be05a7-kube-api-access-stw9v\") pod \"nmstate-webhook-5f558f5558-rl6q7\" (UID: \"ac4e199b-7261-41f0-b9e9-51b167be05a7\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.855743 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6p2bg\" (UniqueName: \"kubernetes.io/projected/5a5acf7a-e38f-4ef1-9576-eae0d8e9a582-kube-api-access-6p2bg\") pod \"nmstate-metrics-9b8c8685d-z6jbc\" (UID: \"5a5acf7a-e38f-4ef1-9576-eae0d8e9a582\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z6jbc" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 
10:36:19.927237 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z6jbc" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.940650 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e3d573a8-f56f-45e5-9905-5810e82af6ac-dbus-socket\") pod \"nmstate-handler-n2hqs\" (UID: \"e3d573a8-f56f-45e5-9905-5810e82af6ac\") " pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.941265 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/e3d573a8-f56f-45e5-9905-5810e82af6ac-dbus-socket\") pod \"nmstate-handler-n2hqs\" (UID: \"e3d573a8-f56f-45e5-9905-5810e82af6ac\") " pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.941310 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/76583484-f8aa-4a95-8450-206a93fb2b6c-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-x2vbf\" (UID: \"76583484-f8aa-4a95-8450-206a93fb2b6c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.941353 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e3d573a8-f56f-45e5-9905-5810e82af6ac-nmstate-lock\") pod \"nmstate-handler-n2hqs\" (UID: \"e3d573a8-f56f-45e5-9905-5810e82af6ac\") " pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.941428 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/76583484-f8aa-4a95-8450-206a93fb2b6c-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-x2vbf\" (UID: 
\"76583484-f8aa-4a95-8450-206a93fb2b6c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.941474 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9zbg9\" (UniqueName: \"kubernetes.io/projected/e3d573a8-f56f-45e5-9905-5810e82af6ac-kube-api-access-9zbg9\") pod \"nmstate-handler-n2hqs\" (UID: \"e3d573a8-f56f-45e5-9905-5810e82af6ac\") " pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.941522 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/e3d573a8-f56f-45e5-9905-5810e82af6ac-nmstate-lock\") pod \"nmstate-handler-n2hqs\" (UID: \"e3d573a8-f56f-45e5-9905-5810e82af6ac\") " pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.941532 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h4c4j\" (UniqueName: \"kubernetes.io/projected/76583484-f8aa-4a95-8450-206a93fb2b6c-kube-api-access-h4c4j\") pod \"nmstate-console-plugin-86f58fcf4-x2vbf\" (UID: \"76583484-f8aa-4a95-8450-206a93fb2b6c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf" Mar 19 10:36:19 crc kubenswrapper[4765]: E0319 10:36:19.941628 4765 secret.go:188] Couldn't get secret openshift-nmstate/plugin-serving-cert: secret "plugin-serving-cert" not found Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.941670 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e3d573a8-f56f-45e5-9905-5810e82af6ac-ovs-socket\") pod \"nmstate-handler-n2hqs\" (UID: \"e3d573a8-f56f-45e5-9905-5810e82af6ac\") " pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:19 crc kubenswrapper[4765]: E0319 10:36:19.941736 4765 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/76583484-f8aa-4a95-8450-206a93fb2b6c-plugin-serving-cert podName:76583484-f8aa-4a95-8450-206a93fb2b6c nodeName:}" failed. No retries permitted until 2026-03-19 10:36:20.441708051 +0000 UTC m=+878.790653773 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/76583484-f8aa-4a95-8450-206a93fb2b6c-plugin-serving-cert") pod "nmstate-console-plugin-86f58fcf4-x2vbf" (UID: "76583484-f8aa-4a95-8450-206a93fb2b6c") : secret "plugin-serving-cert" not found Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.941841 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/e3d573a8-f56f-45e5-9905-5810e82af6ac-ovs-socket\") pod \"nmstate-handler-n2hqs\" (UID: \"e3d573a8-f56f-45e5-9905-5810e82af6ac\") " pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.943187 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/76583484-f8aa-4a95-8450-206a93fb2b6c-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-x2vbf\" (UID: \"76583484-f8aa-4a95-8450-206a93fb2b6c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.963263 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9zbg9\" (UniqueName: \"kubernetes.io/projected/e3d573a8-f56f-45e5-9905-5810e82af6ac-kube-api-access-9zbg9\") pod \"nmstate-handler-n2hqs\" (UID: \"e3d573a8-f56f-45e5-9905-5810e82af6ac\") " pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.966624 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h4c4j\" (UniqueName: \"kubernetes.io/projected/76583484-f8aa-4a95-8450-206a93fb2b6c-kube-api-access-h4c4j\") pod 
\"nmstate-console-plugin-86f58fcf4-x2vbf\" (UID: \"76583484-f8aa-4a95-8450-206a93fb2b6c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf" Mar 19 10:36:19 crc kubenswrapper[4765]: I0319 10:36:19.989694 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.006679 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-8cb86847f-b5mcn"] Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.007481 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.042875 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-oauth-serving-cert\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.042982 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zfslq\" (UniqueName: \"kubernetes.io/projected/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-kube-api-access-zfslq\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.043035 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-console-serving-cert\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 
10:36:20.043057 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-console-oauth-config\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.043132 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-trusted-ca-bundle\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.043188 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-console-config\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.043215 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-service-ca\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.071751 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8cb86847f-b5mcn"] Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.145599 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-console-config\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.145655 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-service-ca\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.145694 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-oauth-serving-cert\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.145732 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zfslq\" (UniqueName: \"kubernetes.io/projected/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-kube-api-access-zfslq\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.145766 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-console-serving-cert\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.145783 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-console-oauth-config\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.145801 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-trusted-ca-bundle\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.147945 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-console-config\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.148317 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-service-ca\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.148762 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-oauth-serving-cert\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.148761 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-trusted-ca-bundle\") 
pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.151628 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-console-oauth-config\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.158838 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-console-serving-cert\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.164246 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zfslq\" (UniqueName: \"kubernetes.io/projected/32e4b396-7a6d-4d59-ad4e-3cc29dd05620-kube-api-access-zfslq\") pod \"console-8cb86847f-b5mcn\" (UID: \"32e4b396-7a6d-4d59-ad4e-3cc29dd05620\") " pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.200183 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-z6jbc"] Mar 19 10:36:20 crc kubenswrapper[4765]: W0319 10:36:20.208793 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a5acf7a_e38f_4ef1_9576_eae0d8e9a582.slice/crio-f20bc7a59e8869c8d96a209d4f67fa7fc4e072229caab85ff92dd84d90cca620 WatchSource:0}: Error finding container f20bc7a59e8869c8d96a209d4f67fa7fc4e072229caab85ff92dd84d90cca620: Status 404 returned error can't find the container with id 
f20bc7a59e8869c8d96a209d4f67fa7fc4e072229caab85ff92dd84d90cca620 Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.253454 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z6jbc" event={"ID":"5a5acf7a-e38f-4ef1-9576-eae0d8e9a582","Type":"ContainerStarted","Data":"f20bc7a59e8869c8d96a209d4f67fa7fc4e072229caab85ff92dd84d90cca620"} Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.254880 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-n2hqs" event={"ID":"e3d573a8-f56f-45e5-9905-5810e82af6ac","Type":"ContainerStarted","Data":"76ca690046759497930a5ee29d46d487811d41246acca5f9d8b01bf139a29547"} Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.348687 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.349240 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ac4e199b-7261-41f0-b9e9-51b167be05a7-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rl6q7\" (UID: \"ac4e199b-7261-41f0-b9e9-51b167be05a7\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.354934 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/ac4e199b-7261-41f0-b9e9-51b167be05a7-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-rl6q7\" (UID: \"ac4e199b-7261-41f0-b9e9-51b167be05a7\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.451905 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/76583484-f8aa-4a95-8450-206a93fb2b6c-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-x2vbf\" (UID: 
\"76583484-f8aa-4a95-8450-206a93fb2b6c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.459364 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/76583484-f8aa-4a95-8450-206a93fb2b6c-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-x2vbf\" (UID: \"76583484-f8aa-4a95-8450-206a93fb2b6c\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.543730 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.570622 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-8cb86847f-b5mcn"] Mar 19 10:36:20 crc kubenswrapper[4765]: W0319 10:36:20.574320 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod32e4b396_7a6d_4d59_ad4e_3cc29dd05620.slice/crio-88f72abe15a9598ad34f9dbf22df9546ef67acb917d55a3f192e0e7d825dba01 WatchSource:0}: Error finding container 88f72abe15a9598ad34f9dbf22df9546ef67acb917d55a3f192e0e7d825dba01: Status 404 returned error can't find the container with id 88f72abe15a9598ad34f9dbf22df9546ef67acb917d55a3f192e0e7d825dba01 Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.703531 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf" Mar 19 10:36:20 crc kubenswrapper[4765]: I0319 10:36:20.759480 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7"] Mar 19 10:36:21 crc kubenswrapper[4765]: I0319 10:36:21.022105 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf"] Mar 19 10:36:21 crc kubenswrapper[4765]: I0319 10:36:21.266921 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf" event={"ID":"76583484-f8aa-4a95-8450-206a93fb2b6c","Type":"ContainerStarted","Data":"e331b1ff4200a3e282e43fcb81fc66c0743c818d487e095c6ec86db39c3b0b76"} Mar 19 10:36:21 crc kubenswrapper[4765]: I0319 10:36:21.268261 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7" event={"ID":"ac4e199b-7261-41f0-b9e9-51b167be05a7","Type":"ContainerStarted","Data":"ffd9d7a95da5c68f3e60340c29323404f3ea72e18b3c7991b94677f4a82969f5"} Mar 19 10:36:21 crc kubenswrapper[4765]: I0319 10:36:21.269815 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8cb86847f-b5mcn" event={"ID":"32e4b396-7a6d-4d59-ad4e-3cc29dd05620","Type":"ContainerStarted","Data":"62bee130c429af15283d644df40638e8731055ed91197869c9d00c77ae0cb17a"} Mar 19 10:36:21 crc kubenswrapper[4765]: I0319 10:36:21.269846 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-8cb86847f-b5mcn" event={"ID":"32e4b396-7a6d-4d59-ad4e-3cc29dd05620","Type":"ContainerStarted","Data":"88f72abe15a9598ad34f9dbf22df9546ef67acb917d55a3f192e0e7d825dba01"} Mar 19 10:36:21 crc kubenswrapper[4765]: I0319 10:36:21.291629 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-8cb86847f-b5mcn" podStartSLOduration=2.291609366 podStartE2EDuration="2.291609366s" 
podCreationTimestamp="2026-03-19 10:36:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:36:21.287369419 +0000 UTC m=+879.636314981" watchObservedRunningTime="2026-03-19 10:36:21.291609366 +0000 UTC m=+879.640554908" Mar 19 10:36:22 crc kubenswrapper[4765]: I0319 10:36:22.046015 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:22 crc kubenswrapper[4765]: I0319 10:36:22.112573 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:22 crc kubenswrapper[4765]: I0319 10:36:22.284468 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-brkb2"] Mar 19 10:36:23 crc kubenswrapper[4765]: I0319 10:36:23.293424 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7" event={"ID":"ac4e199b-7261-41f0-b9e9-51b167be05a7","Type":"ContainerStarted","Data":"65697f3b164e6f17ce5ea088abd234e4fa59cd5bd09c6a4dd5b441ee99469ec0"} Mar 19 10:36:23 crc kubenswrapper[4765]: I0319 10:36:23.294525 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7" Mar 19 10:36:23 crc kubenswrapper[4765]: I0319 10:36:23.302901 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z6jbc" event={"ID":"5a5acf7a-e38f-4ef1-9576-eae0d8e9a582","Type":"ContainerStarted","Data":"02f7048c3cd023b871073b8dc956b8d8f614c082100cec0fc531a94c260ccdf3"} Mar 19 10:36:23 crc kubenswrapper[4765]: I0319 10:36:23.305014 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-n2hqs" 
event={"ID":"e3d573a8-f56f-45e5-9905-5810e82af6ac","Type":"ContainerStarted","Data":"afdb0b45d58c744f38d7e967b6070d27889bf18276c28a5ccb3104fd32502675"} Mar 19 10:36:23 crc kubenswrapper[4765]: I0319 10:36:23.305360 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-brkb2" podUID="c386e3c2-2465-4d85-a424-2ff9b5489178" containerName="registry-server" containerID="cri-o://a4144029f07348f46ad263eec6eca566d8befa63f1838707ecbbdc3288ab3d9c" gracePeriod=2 Mar 19 10:36:23 crc kubenswrapper[4765]: I0319 10:36:23.320451 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7" podStartSLOduration=2.199716762 podStartE2EDuration="4.320400205s" podCreationTimestamp="2026-03-19 10:36:19 +0000 UTC" firstStartedPulling="2026-03-19 10:36:20.768507375 +0000 UTC m=+879.117452917" lastFinishedPulling="2026-03-19 10:36:22.889190818 +0000 UTC m=+881.238136360" observedRunningTime="2026-03-19 10:36:23.314174834 +0000 UTC m=+881.663120406" watchObservedRunningTime="2026-03-19 10:36:23.320400205 +0000 UTC m=+881.669345747" Mar 19 10:36:23 crc kubenswrapper[4765]: I0319 10:36:23.349009 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-n2hqs" podStartSLOduration=1.511570071 podStartE2EDuration="4.348975773s" podCreationTimestamp="2026-03-19 10:36:19 +0000 UTC" firstStartedPulling="2026-03-19 10:36:20.046169151 +0000 UTC m=+878.395114693" lastFinishedPulling="2026-03-19 10:36:22.883574853 +0000 UTC m=+881.232520395" observedRunningTime="2026-03-19 10:36:23.347878563 +0000 UTC m=+881.696824135" watchObservedRunningTime="2026-03-19 10:36:23.348975773 +0000 UTC m=+881.697921315" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.141841 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.244634 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c386e3c2-2465-4d85-a424-2ff9b5489178-catalog-content\") pod \"c386e3c2-2465-4d85-a424-2ff9b5489178\" (UID: \"c386e3c2-2465-4d85-a424-2ff9b5489178\") " Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.245356 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c386e3c2-2465-4d85-a424-2ff9b5489178-utilities\") pod \"c386e3c2-2465-4d85-a424-2ff9b5489178\" (UID: \"c386e3c2-2465-4d85-a424-2ff9b5489178\") " Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.245452 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzlvc\" (UniqueName: \"kubernetes.io/projected/c386e3c2-2465-4d85-a424-2ff9b5489178-kube-api-access-qzlvc\") pod \"c386e3c2-2465-4d85-a424-2ff9b5489178\" (UID: \"c386e3c2-2465-4d85-a424-2ff9b5489178\") " Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.246275 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c386e3c2-2465-4d85-a424-2ff9b5489178-utilities" (OuterVolumeSpecName: "utilities") pod "c386e3c2-2465-4d85-a424-2ff9b5489178" (UID: "c386e3c2-2465-4d85-a424-2ff9b5489178"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.248763 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c386e3c2-2465-4d85-a424-2ff9b5489178-kube-api-access-qzlvc" (OuterVolumeSpecName: "kube-api-access-qzlvc") pod "c386e3c2-2465-4d85-a424-2ff9b5489178" (UID: "c386e3c2-2465-4d85-a424-2ff9b5489178"). InnerVolumeSpecName "kube-api-access-qzlvc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.313811 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf" event={"ID":"76583484-f8aa-4a95-8450-206a93fb2b6c","Type":"ContainerStarted","Data":"ee28428642b2e2ba991ecba6e560d75fca0bf3a2d9d31a9da7e42ecc07b604e8"} Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.317459 4765 generic.go:334] "Generic (PLEG): container finished" podID="c386e3c2-2465-4d85-a424-2ff9b5489178" containerID="a4144029f07348f46ad263eec6eca566d8befa63f1838707ecbbdc3288ab3d9c" exitCode=0 Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.317558 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brkb2" event={"ID":"c386e3c2-2465-4d85-a424-2ff9b5489178","Type":"ContainerDied","Data":"a4144029f07348f46ad263eec6eca566d8befa63f1838707ecbbdc3288ab3d9c"} Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.317601 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-brkb2" event={"ID":"c386e3c2-2465-4d85-a424-2ff9b5489178","Type":"ContainerDied","Data":"5fe4b0b8267bbebffa70bc0fa3beaa6e94284ef77290a6417aea3c4b00fbe585"} Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.317628 4765 scope.go:117] "RemoveContainer" containerID="a4144029f07348f46ad263eec6eca566d8befa63f1838707ecbbdc3288ab3d9c" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.317862 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-brkb2" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.318090 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.335047 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-x2vbf" podStartSLOduration=2.415517022 podStartE2EDuration="5.335027017s" podCreationTimestamp="2026-03-19 10:36:19 +0000 UTC" firstStartedPulling="2026-03-19 10:36:21.030656862 +0000 UTC m=+879.379602404" lastFinishedPulling="2026-03-19 10:36:23.950166857 +0000 UTC m=+882.299112399" observedRunningTime="2026-03-19 10:36:24.33187995 +0000 UTC m=+882.680825502" watchObservedRunningTime="2026-03-19 10:36:24.335027017 +0000 UTC m=+882.683972549" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.344089 4765 scope.go:117] "RemoveContainer" containerID="dd9136f96f7101de9ed7059235cb6b53445f84f69c95d4624173a355f3444d96" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.347709 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c386e3c2-2465-4d85-a424-2ff9b5489178-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.347758 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzlvc\" (UniqueName: \"kubernetes.io/projected/c386e3c2-2465-4d85-a424-2ff9b5489178-kube-api-access-qzlvc\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.371812 4765 scope.go:117] "RemoveContainer" containerID="a3dc3904cfc5551a7d980d101230408be6e416badeff94b54794d45a8da9b852" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.394572 4765 scope.go:117] "RemoveContainer" containerID="a4144029f07348f46ad263eec6eca566d8befa63f1838707ecbbdc3288ab3d9c" Mar 19 10:36:24 crc 
kubenswrapper[4765]: E0319 10:36:24.395751 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a4144029f07348f46ad263eec6eca566d8befa63f1838707ecbbdc3288ab3d9c\": container with ID starting with a4144029f07348f46ad263eec6eca566d8befa63f1838707ecbbdc3288ab3d9c not found: ID does not exist" containerID="a4144029f07348f46ad263eec6eca566d8befa63f1838707ecbbdc3288ab3d9c" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.395817 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a4144029f07348f46ad263eec6eca566d8befa63f1838707ecbbdc3288ab3d9c"} err="failed to get container status \"a4144029f07348f46ad263eec6eca566d8befa63f1838707ecbbdc3288ab3d9c\": rpc error: code = NotFound desc = could not find container \"a4144029f07348f46ad263eec6eca566d8befa63f1838707ecbbdc3288ab3d9c\": container with ID starting with a4144029f07348f46ad263eec6eca566d8befa63f1838707ecbbdc3288ab3d9c not found: ID does not exist" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.395852 4765 scope.go:117] "RemoveContainer" containerID="dd9136f96f7101de9ed7059235cb6b53445f84f69c95d4624173a355f3444d96" Mar 19 10:36:24 crc kubenswrapper[4765]: E0319 10:36:24.396360 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd9136f96f7101de9ed7059235cb6b53445f84f69c95d4624173a355f3444d96\": container with ID starting with dd9136f96f7101de9ed7059235cb6b53445f84f69c95d4624173a355f3444d96 not found: ID does not exist" containerID="dd9136f96f7101de9ed7059235cb6b53445f84f69c95d4624173a355f3444d96" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.396420 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd9136f96f7101de9ed7059235cb6b53445f84f69c95d4624173a355f3444d96"} err="failed to get container status 
\"dd9136f96f7101de9ed7059235cb6b53445f84f69c95d4624173a355f3444d96\": rpc error: code = NotFound desc = could not find container \"dd9136f96f7101de9ed7059235cb6b53445f84f69c95d4624173a355f3444d96\": container with ID starting with dd9136f96f7101de9ed7059235cb6b53445f84f69c95d4624173a355f3444d96 not found: ID does not exist" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.396460 4765 scope.go:117] "RemoveContainer" containerID="a3dc3904cfc5551a7d980d101230408be6e416badeff94b54794d45a8da9b852" Mar 19 10:36:24 crc kubenswrapper[4765]: E0319 10:36:24.397549 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a3dc3904cfc5551a7d980d101230408be6e416badeff94b54794d45a8da9b852\": container with ID starting with a3dc3904cfc5551a7d980d101230408be6e416badeff94b54794d45a8da9b852 not found: ID does not exist" containerID="a3dc3904cfc5551a7d980d101230408be6e416badeff94b54794d45a8da9b852" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.397848 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a3dc3904cfc5551a7d980d101230408be6e416badeff94b54794d45a8da9b852"} err="failed to get container status \"a3dc3904cfc5551a7d980d101230408be6e416badeff94b54794d45a8da9b852\": rpc error: code = NotFound desc = could not find container \"a3dc3904cfc5551a7d980d101230408be6e416badeff94b54794d45a8da9b852\": container with ID starting with a3dc3904cfc5551a7d980d101230408be6e416badeff94b54794d45a8da9b852 not found: ID does not exist" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.401355 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c386e3c2-2465-4d85-a424-2ff9b5489178-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c386e3c2-2465-4d85-a424-2ff9b5489178" (UID: "c386e3c2-2465-4d85-a424-2ff9b5489178"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.450920 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c386e3c2-2465-4d85-a424-2ff9b5489178-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.654495 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-brkb2"] Mar 19 10:36:24 crc kubenswrapper[4765]: I0319 10:36:24.657943 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-brkb2"] Mar 19 10:36:26 crc kubenswrapper[4765]: I0319 10:36:26.337290 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z6jbc" event={"ID":"5a5acf7a-e38f-4ef1-9576-eae0d8e9a582","Type":"ContainerStarted","Data":"8e98c401af2ae753bf9b961e19bce49c7051d2ec4b1e135be610da18f61c0d43"} Mar 19 10:36:26 crc kubenswrapper[4765]: I0319 10:36:26.361578 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-z6jbc" podStartSLOduration=2.084114495 podStartE2EDuration="7.361547614s" podCreationTimestamp="2026-03-19 10:36:19 +0000 UTC" firstStartedPulling="2026-03-19 10:36:20.210891002 +0000 UTC m=+878.559836544" lastFinishedPulling="2026-03-19 10:36:25.488324121 +0000 UTC m=+883.837269663" observedRunningTime="2026-03-19 10:36:26.360497615 +0000 UTC m=+884.709443167" watchObservedRunningTime="2026-03-19 10:36:26.361547614 +0000 UTC m=+884.710493156" Mar 19 10:36:26 crc kubenswrapper[4765]: I0319 10:36:26.364201 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c386e3c2-2465-4d85-a424-2ff9b5489178" path="/var/lib/kubelet/pods/c386e3c2-2465-4d85-a424-2ff9b5489178/volumes" Mar 19 10:36:30 crc kubenswrapper[4765]: I0319 10:36:30.017681 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openshift-nmstate/nmstate-handler-n2hqs" Mar 19 10:36:30 crc kubenswrapper[4765]: I0319 10:36:30.349731 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:30 crc kubenswrapper[4765]: I0319 10:36:30.349828 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:30 crc kubenswrapper[4765]: I0319 10:36:30.370375 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:30 crc kubenswrapper[4765]: I0319 10:36:30.377525 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-8cb86847f-b5mcn" Mar 19 10:36:30 crc kubenswrapper[4765]: I0319 10:36:30.457009 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-94dnk"] Mar 19 10:36:31 crc kubenswrapper[4765]: I0319 10:36:31.656328 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:36:31 crc kubenswrapper[4765]: I0319 10:36:31.658085 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:36:40 crc kubenswrapper[4765]: I0319 10:36:40.549747 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-rl6q7" Mar 19 10:36:42 crc kubenswrapper[4765]: I0319 10:36:42.922116 4765 scope.go:117] 
"RemoveContainer" containerID="9c2a9d1ed92b4f97e449d5a1f6e5e1be3bc702936ec8effeea98a03643002fc3" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.210457 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2"] Mar 19 10:36:53 crc kubenswrapper[4765]: E0319 10:36:53.211394 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c386e3c2-2465-4d85-a424-2ff9b5489178" containerName="extract-content" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.211417 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c386e3c2-2465-4d85-a424-2ff9b5489178" containerName="extract-content" Mar 19 10:36:53 crc kubenswrapper[4765]: E0319 10:36:53.211433 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c386e3c2-2465-4d85-a424-2ff9b5489178" containerName="registry-server" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.211441 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c386e3c2-2465-4d85-a424-2ff9b5489178" containerName="registry-server" Mar 19 10:36:53 crc kubenswrapper[4765]: E0319 10:36:53.211465 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c386e3c2-2465-4d85-a424-2ff9b5489178" containerName="extract-utilities" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.211475 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c386e3c2-2465-4d85-a424-2ff9b5489178" containerName="extract-utilities" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.211608 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c386e3c2-2465-4d85-a424-2ff9b5489178" containerName="registry-server" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.212541 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.215178 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.226303 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2"] Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.404205 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wv55f\" (UniqueName: \"kubernetes.io/projected/379c6607-c195-4779-83b7-bdc20f7cda09-kube-api-access-wv55f\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2\" (UID: \"379c6607-c195-4779-83b7-bdc20f7cda09\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.404296 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/379c6607-c195-4779-83b7-bdc20f7cda09-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2\" (UID: \"379c6607-c195-4779-83b7-bdc20f7cda09\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.404915 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/379c6607-c195-4779-83b7-bdc20f7cda09-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2\" (UID: \"379c6607-c195-4779-83b7-bdc20f7cda09\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" Mar 19 10:36:53 crc kubenswrapper[4765]: 
I0319 10:36:53.506153 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/379c6607-c195-4779-83b7-bdc20f7cda09-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2\" (UID: \"379c6607-c195-4779-83b7-bdc20f7cda09\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.506233 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wv55f\" (UniqueName: \"kubernetes.io/projected/379c6607-c195-4779-83b7-bdc20f7cda09-kube-api-access-wv55f\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2\" (UID: \"379c6607-c195-4779-83b7-bdc20f7cda09\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.506271 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/379c6607-c195-4779-83b7-bdc20f7cda09-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2\" (UID: \"379c6607-c195-4779-83b7-bdc20f7cda09\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.506643 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/379c6607-c195-4779-83b7-bdc20f7cda09-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2\" (UID: \"379c6607-c195-4779-83b7-bdc20f7cda09\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.506778 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/379c6607-c195-4779-83b7-bdc20f7cda09-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2\" (UID: \"379c6607-c195-4779-83b7-bdc20f7cda09\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.526623 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wv55f\" (UniqueName: \"kubernetes.io/projected/379c6607-c195-4779-83b7-bdc20f7cda09-kube-api-access-wv55f\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2\" (UID: \"379c6607-c195-4779-83b7-bdc20f7cda09\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.529485 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" Mar 19 10:36:53 crc kubenswrapper[4765]: I0319 10:36:53.722191 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2"] Mar 19 10:36:54 crc kubenswrapper[4765]: I0319 10:36:54.564739 4765 generic.go:334] "Generic (PLEG): container finished" podID="379c6607-c195-4779-83b7-bdc20f7cda09" containerID="e297aee253549d837f77e06e8da8cc6b9b6ec931b3342590bea921df65aa167d" exitCode=0 Mar 19 10:36:54 crc kubenswrapper[4765]: I0319 10:36:54.564865 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" event={"ID":"379c6607-c195-4779-83b7-bdc20f7cda09","Type":"ContainerDied","Data":"e297aee253549d837f77e06e8da8cc6b9b6ec931b3342590bea921df65aa167d"} Mar 19 10:36:54 crc kubenswrapper[4765]: I0319 10:36:54.565280 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" event={"ID":"379c6607-c195-4779-83b7-bdc20f7cda09","Type":"ContainerStarted","Data":"977a985a9e7fbad4fed9c0c06cb425247e68e12c28dcecd43162de3e24660264"} Mar 19 10:36:55 crc kubenswrapper[4765]: I0319 10:36:55.505065 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-94dnk" podUID="39658af6-59cf-48c7-9015-2271021bd64e" containerName="console" containerID="cri-o://c472a4f28b6d608e9535fcdddaae326b95b00c26be8dc28ff396505d32df7875" gracePeriod=15 Mar 19 10:36:55 crc kubenswrapper[4765]: I0319 10:36:55.862660 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-94dnk_39658af6-59cf-48c7-9015-2271021bd64e/console/0.log" Mar 19 10:36:55 crc kubenswrapper[4765]: I0319 10:36:55.862770 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.041871 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-trusted-ca-bundle\") pod \"39658af6-59cf-48c7-9015-2271021bd64e\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.042474 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wk4xl\" (UniqueName: \"kubernetes.io/projected/39658af6-59cf-48c7-9015-2271021bd64e-kube-api-access-wk4xl\") pod \"39658af6-59cf-48c7-9015-2271021bd64e\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.042535 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-console-config\") pod \"39658af6-59cf-48c7-9015-2271021bd64e\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.042602 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39658af6-59cf-48c7-9015-2271021bd64e-console-oauth-config\") pod \"39658af6-59cf-48c7-9015-2271021bd64e\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.042676 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-oauth-serving-cert\") pod \"39658af6-59cf-48c7-9015-2271021bd64e\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.042696 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-service-ca\") pod \"39658af6-59cf-48c7-9015-2271021bd64e\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.042749 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39658af6-59cf-48c7-9015-2271021bd64e-console-serving-cert\") pod \"39658af6-59cf-48c7-9015-2271021bd64e\" (UID: \"39658af6-59cf-48c7-9015-2271021bd64e\") " Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.044971 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-service-ca" (OuterVolumeSpecName: "service-ca") pod "39658af6-59cf-48c7-9015-2271021bd64e" (UID: "39658af6-59cf-48c7-9015-2271021bd64e"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.045233 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "39658af6-59cf-48c7-9015-2271021bd64e" (UID: "39658af6-59cf-48c7-9015-2271021bd64e"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.048330 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-console-config" (OuterVolumeSpecName: "console-config") pod "39658af6-59cf-48c7-9015-2271021bd64e" (UID: "39658af6-59cf-48c7-9015-2271021bd64e"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.049938 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "39658af6-59cf-48c7-9015-2271021bd64e" (UID: "39658af6-59cf-48c7-9015-2271021bd64e"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.064195 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39658af6-59cf-48c7-9015-2271021bd64e-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "39658af6-59cf-48c7-9015-2271021bd64e" (UID: "39658af6-59cf-48c7-9015-2271021bd64e"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.064325 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39658af6-59cf-48c7-9015-2271021bd64e-kube-api-access-wk4xl" (OuterVolumeSpecName: "kube-api-access-wk4xl") pod "39658af6-59cf-48c7-9015-2271021bd64e" (UID: "39658af6-59cf-48c7-9015-2271021bd64e"). InnerVolumeSpecName "kube-api-access-wk4xl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.065116 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39658af6-59cf-48c7-9015-2271021bd64e-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "39658af6-59cf-48c7-9015-2271021bd64e" (UID: "39658af6-59cf-48c7-9015-2271021bd64e"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.144807 4765 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.144859 4765 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-service-ca\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.144869 4765 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/39658af6-59cf-48c7-9015-2271021bd64e-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.144878 4765 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.144887 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wk4xl\" (UniqueName: \"kubernetes.io/projected/39658af6-59cf-48c7-9015-2271021bd64e-kube-api-access-wk4xl\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.144898 4765 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/39658af6-59cf-48c7-9015-2271021bd64e-console-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.144906 4765 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/39658af6-59cf-48c7-9015-2271021bd64e-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.580788 4765 generic.go:334] "Generic (PLEG): container finished" podID="379c6607-c195-4779-83b7-bdc20f7cda09" containerID="a8e6a6bf4fa52a051c280283bc94c6ebc3c56f924aea5b224bef077fd90d3fba" exitCode=0 Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.580901 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" event={"ID":"379c6607-c195-4779-83b7-bdc20f7cda09","Type":"ContainerDied","Data":"a8e6a6bf4fa52a051c280283bc94c6ebc3c56f924aea5b224bef077fd90d3fba"} Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.584637 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-94dnk_39658af6-59cf-48c7-9015-2271021bd64e/console/0.log" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.584725 4765 generic.go:334] "Generic (PLEG): container finished" podID="39658af6-59cf-48c7-9015-2271021bd64e" 
containerID="c472a4f28b6d608e9535fcdddaae326b95b00c26be8dc28ff396505d32df7875" exitCode=2 Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.584775 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-94dnk" event={"ID":"39658af6-59cf-48c7-9015-2271021bd64e","Type":"ContainerDied","Data":"c472a4f28b6d608e9535fcdddaae326b95b00c26be8dc28ff396505d32df7875"} Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.584794 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-94dnk" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.584813 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-94dnk" event={"ID":"39658af6-59cf-48c7-9015-2271021bd64e","Type":"ContainerDied","Data":"889033b7b867a2e8b89847a5fa8c64dadb2f84f573cc02fffe753dae402fd2d8"} Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.584850 4765 scope.go:117] "RemoveContainer" containerID="c472a4f28b6d608e9535fcdddaae326b95b00c26be8dc28ff396505d32df7875" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.611762 4765 scope.go:117] "RemoveContainer" containerID="c472a4f28b6d608e9535fcdddaae326b95b00c26be8dc28ff396505d32df7875" Mar 19 10:36:56 crc kubenswrapper[4765]: E0319 10:36:56.612413 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c472a4f28b6d608e9535fcdddaae326b95b00c26be8dc28ff396505d32df7875\": container with ID starting with c472a4f28b6d608e9535fcdddaae326b95b00c26be8dc28ff396505d32df7875 not found: ID does not exist" containerID="c472a4f28b6d608e9535fcdddaae326b95b00c26be8dc28ff396505d32df7875" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.612476 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c472a4f28b6d608e9535fcdddaae326b95b00c26be8dc28ff396505d32df7875"} err="failed to get container 
status \"c472a4f28b6d608e9535fcdddaae326b95b00c26be8dc28ff396505d32df7875\": rpc error: code = NotFound desc = could not find container \"c472a4f28b6d608e9535fcdddaae326b95b00c26be8dc28ff396505d32df7875\": container with ID starting with c472a4f28b6d608e9535fcdddaae326b95b00c26be8dc28ff396505d32df7875 not found: ID does not exist" Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.623284 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-94dnk"] Mar 19 10:36:56 crc kubenswrapper[4765]: I0319 10:36:56.630893 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-94dnk"] Mar 19 10:36:57 crc kubenswrapper[4765]: I0319 10:36:57.596040 4765 generic.go:334] "Generic (PLEG): container finished" podID="379c6607-c195-4779-83b7-bdc20f7cda09" containerID="3c98dc750f90988e5840b6f5ec2953ce10e54b4d57e8e456b88e551ce5b46111" exitCode=0 Mar 19 10:36:57 crc kubenswrapper[4765]: I0319 10:36:57.596066 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" event={"ID":"379c6607-c195-4779-83b7-bdc20f7cda09","Type":"ContainerDied","Data":"3c98dc750f90988e5840b6f5ec2953ce10e54b4d57e8e456b88e551ce5b46111"} Mar 19 10:36:58 crc kubenswrapper[4765]: I0319 10:36:58.372380 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39658af6-59cf-48c7-9015-2271021bd64e" path="/var/lib/kubelet/pods/39658af6-59cf-48c7-9015-2271021bd64e/volumes" Mar 19 10:36:58 crc kubenswrapper[4765]: I0319 10:36:58.891152 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" Mar 19 10:36:59 crc kubenswrapper[4765]: I0319 10:36:59.089017 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/379c6607-c195-4779-83b7-bdc20f7cda09-util\") pod \"379c6607-c195-4779-83b7-bdc20f7cda09\" (UID: \"379c6607-c195-4779-83b7-bdc20f7cda09\") " Mar 19 10:36:59 crc kubenswrapper[4765]: I0319 10:36:59.089342 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wv55f\" (UniqueName: \"kubernetes.io/projected/379c6607-c195-4779-83b7-bdc20f7cda09-kube-api-access-wv55f\") pod \"379c6607-c195-4779-83b7-bdc20f7cda09\" (UID: \"379c6607-c195-4779-83b7-bdc20f7cda09\") " Mar 19 10:36:59 crc kubenswrapper[4765]: I0319 10:36:59.089371 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/379c6607-c195-4779-83b7-bdc20f7cda09-bundle\") pod \"379c6607-c195-4779-83b7-bdc20f7cda09\" (UID: \"379c6607-c195-4779-83b7-bdc20f7cda09\") " Mar 19 10:36:59 crc kubenswrapper[4765]: I0319 10:36:59.090762 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379c6607-c195-4779-83b7-bdc20f7cda09-bundle" (OuterVolumeSpecName: "bundle") pod "379c6607-c195-4779-83b7-bdc20f7cda09" (UID: "379c6607-c195-4779-83b7-bdc20f7cda09"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:36:59 crc kubenswrapper[4765]: I0319 10:36:59.098930 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/379c6607-c195-4779-83b7-bdc20f7cda09-kube-api-access-wv55f" (OuterVolumeSpecName: "kube-api-access-wv55f") pod "379c6607-c195-4779-83b7-bdc20f7cda09" (UID: "379c6607-c195-4779-83b7-bdc20f7cda09"). InnerVolumeSpecName "kube-api-access-wv55f". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:36:59 crc kubenswrapper[4765]: I0319 10:36:59.104984 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/379c6607-c195-4779-83b7-bdc20f7cda09-util" (OuterVolumeSpecName: "util") pod "379c6607-c195-4779-83b7-bdc20f7cda09" (UID: "379c6607-c195-4779-83b7-bdc20f7cda09"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:36:59 crc kubenswrapper[4765]: I0319 10:36:59.193946 4765 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/379c6607-c195-4779-83b7-bdc20f7cda09-util\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:59 crc kubenswrapper[4765]: I0319 10:36:59.194011 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wv55f\" (UniqueName: \"kubernetes.io/projected/379c6607-c195-4779-83b7-bdc20f7cda09-kube-api-access-wv55f\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:59 crc kubenswrapper[4765]: I0319 10:36:59.194027 4765 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/379c6607-c195-4779-83b7-bdc20f7cda09-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:36:59 crc kubenswrapper[4765]: I0319 10:36:59.611554 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" event={"ID":"379c6607-c195-4779-83b7-bdc20f7cda09","Type":"ContainerDied","Data":"977a985a9e7fbad4fed9c0c06cb425247e68e12c28dcecd43162de3e24660264"} Mar 19 10:36:59 crc kubenswrapper[4765]: I0319 10:36:59.611644 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="977a985a9e7fbad4fed9c0c06cb425247e68e12c28dcecd43162de3e24660264" Mar 19 10:36:59 crc kubenswrapper[4765]: I0319 10:36:59.611705 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2" Mar 19 10:37:01 crc kubenswrapper[4765]: I0319 10:37:01.656359 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:37:01 crc kubenswrapper[4765]: I0319 10:37:01.656806 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:37:01 crc kubenswrapper[4765]: I0319 10:37:01.656882 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:37:01 crc kubenswrapper[4765]: I0319 10:37:01.657737 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c126315de99fbe26aafdf378053a0eb9d09d2fb8e735089e0f39caceb743cb3e"} pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:37:01 crc kubenswrapper[4765]: I0319 10:37:01.657809 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://c126315de99fbe26aafdf378053a0eb9d09d2fb8e735089e0f39caceb743cb3e" gracePeriod=600 Mar 19 10:37:02 crc kubenswrapper[4765]: I0319 10:37:02.632160 4765 generic.go:334] "Generic (PLEG): 
container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerID="c126315de99fbe26aafdf378053a0eb9d09d2fb8e735089e0f39caceb743cb3e" exitCode=0 Mar 19 10:37:02 crc kubenswrapper[4765]: I0319 10:37:02.632208 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"c126315de99fbe26aafdf378053a0eb9d09d2fb8e735089e0f39caceb743cb3e"} Mar 19 10:37:02 crc kubenswrapper[4765]: I0319 10:37:02.633076 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"74ef4f9e7cb23afbd3cc2c57d6c7b62007d3fc20daf2ec79338ae2ff820f9dfb"} Mar 19 10:37:02 crc kubenswrapper[4765]: I0319 10:37:02.633111 4765 scope.go:117] "RemoveContainer" containerID="52c282ccaa9b12441cda9329a58912d852bd314df6cce7dba63b49f9309b4b08" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.431863 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg"] Mar 19 10:37:09 crc kubenswrapper[4765]: E0319 10:37:09.434190 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39658af6-59cf-48c7-9015-2271021bd64e" containerName="console" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.434289 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="39658af6-59cf-48c7-9015-2271021bd64e" containerName="console" Mar 19 10:37:09 crc kubenswrapper[4765]: E0319 10:37:09.434374 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379c6607-c195-4779-83b7-bdc20f7cda09" containerName="pull" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.434435 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="379c6607-c195-4779-83b7-bdc20f7cda09" containerName="pull" Mar 19 10:37:09 crc kubenswrapper[4765]: E0319 
10:37:09.434503 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379c6607-c195-4779-83b7-bdc20f7cda09" containerName="util" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.434572 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="379c6607-c195-4779-83b7-bdc20f7cda09" containerName="util" Mar 19 10:37:09 crc kubenswrapper[4765]: E0319 10:37:09.434655 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="379c6607-c195-4779-83b7-bdc20f7cda09" containerName="extract" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.434717 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="379c6607-c195-4779-83b7-bdc20f7cda09" containerName="extract" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.434917 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="379c6607-c195-4779-83b7-bdc20f7cda09" containerName="extract" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.435032 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="39658af6-59cf-48c7-9015-2271021bd64e" containerName="console" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.435777 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.438506 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-kvfpg" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.439311 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.439501 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.440770 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.441681 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.468357 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg"] Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.579635 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc004464-2eb9-4b7d-addf-91b7b69e01b6-webhook-cert\") pod \"metallb-operator-controller-manager-7b6fd59d6c-c8kfg\" (UID: \"fc004464-2eb9-4b7d-addf-91b7b69e01b6\") " pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.579700 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kdnrg\" (UniqueName: \"kubernetes.io/projected/fc004464-2eb9-4b7d-addf-91b7b69e01b6-kube-api-access-kdnrg\") pod 
\"metallb-operator-controller-manager-7b6fd59d6c-c8kfg\" (UID: \"fc004464-2eb9-4b7d-addf-91b7b69e01b6\") " pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.579754 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc004464-2eb9-4b7d-addf-91b7b69e01b6-apiservice-cert\") pod \"metallb-operator-controller-manager-7b6fd59d6c-c8kfg\" (UID: \"fc004464-2eb9-4b7d-addf-91b7b69e01b6\") " pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.682109 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc004464-2eb9-4b7d-addf-91b7b69e01b6-apiservice-cert\") pod \"metallb-operator-controller-manager-7b6fd59d6c-c8kfg\" (UID: \"fc004464-2eb9-4b7d-addf-91b7b69e01b6\") " pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.682214 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc004464-2eb9-4b7d-addf-91b7b69e01b6-webhook-cert\") pod \"metallb-operator-controller-manager-7b6fd59d6c-c8kfg\" (UID: \"fc004464-2eb9-4b7d-addf-91b7b69e01b6\") " pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.682248 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kdnrg\" (UniqueName: \"kubernetes.io/projected/fc004464-2eb9-4b7d-addf-91b7b69e01b6-kube-api-access-kdnrg\") pod \"metallb-operator-controller-manager-7b6fd59d6c-c8kfg\" (UID: \"fc004464-2eb9-4b7d-addf-91b7b69e01b6\") " pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" Mar 19 10:37:09 crc 
kubenswrapper[4765]: I0319 10:37:09.689279 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/fc004464-2eb9-4b7d-addf-91b7b69e01b6-apiservice-cert\") pod \"metallb-operator-controller-manager-7b6fd59d6c-c8kfg\" (UID: \"fc004464-2eb9-4b7d-addf-91b7b69e01b6\") " pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.702560 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kdnrg\" (UniqueName: \"kubernetes.io/projected/fc004464-2eb9-4b7d-addf-91b7b69e01b6-kube-api-access-kdnrg\") pod \"metallb-operator-controller-manager-7b6fd59d6c-c8kfg\" (UID: \"fc004464-2eb9-4b7d-addf-91b7b69e01b6\") " pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.703282 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/fc004464-2eb9-4b7d-addf-91b7b69e01b6-webhook-cert\") pod \"metallb-operator-controller-manager-7b6fd59d6c-c8kfg\" (UID: \"fc004464-2eb9-4b7d-addf-91b7b69e01b6\") " pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.755866 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.762117 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj"] Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.763275 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.765501 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-2ndtc" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.765911 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.766144 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.780671 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj"] Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.884693 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3f409a1-a978-47fc-9907-fec4a720ae18-webhook-cert\") pod \"metallb-operator-webhook-server-757bfbd67b-zzdvj\" (UID: \"d3f409a1-a978-47fc-9907-fec4a720ae18\") " pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.885121 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3f409a1-a978-47fc-9907-fec4a720ae18-apiservice-cert\") pod \"metallb-operator-webhook-server-757bfbd67b-zzdvj\" (UID: \"d3f409a1-a978-47fc-9907-fec4a720ae18\") " pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.885192 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kkxq\" (UniqueName: 
\"kubernetes.io/projected/d3f409a1-a978-47fc-9907-fec4a720ae18-kube-api-access-8kkxq\") pod \"metallb-operator-webhook-server-757bfbd67b-zzdvj\" (UID: \"d3f409a1-a978-47fc-9907-fec4a720ae18\") " pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.986417 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8kkxq\" (UniqueName: \"kubernetes.io/projected/d3f409a1-a978-47fc-9907-fec4a720ae18-kube-api-access-8kkxq\") pod \"metallb-operator-webhook-server-757bfbd67b-zzdvj\" (UID: \"d3f409a1-a978-47fc-9907-fec4a720ae18\") " pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.987011 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3f409a1-a978-47fc-9907-fec4a720ae18-webhook-cert\") pod \"metallb-operator-webhook-server-757bfbd67b-zzdvj\" (UID: \"d3f409a1-a978-47fc-9907-fec4a720ae18\") " pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" Mar 19 10:37:09 crc kubenswrapper[4765]: I0319 10:37:09.987078 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3f409a1-a978-47fc-9907-fec4a720ae18-apiservice-cert\") pod \"metallb-operator-webhook-server-757bfbd67b-zzdvj\" (UID: \"d3f409a1-a978-47fc-9907-fec4a720ae18\") " pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" Mar 19 10:37:10 crc kubenswrapper[4765]: I0319 10:37:10.002015 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d3f409a1-a978-47fc-9907-fec4a720ae18-webhook-cert\") pod \"metallb-operator-webhook-server-757bfbd67b-zzdvj\" (UID: \"d3f409a1-a978-47fc-9907-fec4a720ae18\") " pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" Mar 19 
10:37:10 crc kubenswrapper[4765]: I0319 10:37:10.005806 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kkxq\" (UniqueName: \"kubernetes.io/projected/d3f409a1-a978-47fc-9907-fec4a720ae18-kube-api-access-8kkxq\") pod \"metallb-operator-webhook-server-757bfbd67b-zzdvj\" (UID: \"d3f409a1-a978-47fc-9907-fec4a720ae18\") " pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" Mar 19 10:37:10 crc kubenswrapper[4765]: I0319 10:37:10.010232 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d3f409a1-a978-47fc-9907-fec4a720ae18-apiservice-cert\") pod \"metallb-operator-webhook-server-757bfbd67b-zzdvj\" (UID: \"d3f409a1-a978-47fc-9907-fec4a720ae18\") " pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" Mar 19 10:37:10 crc kubenswrapper[4765]: I0319 10:37:10.064495 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg"] Mar 19 10:37:10 crc kubenswrapper[4765]: I0319 10:37:10.072698 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 10:37:10 crc kubenswrapper[4765]: I0319 10:37:10.129098 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" Mar 19 10:37:10 crc kubenswrapper[4765]: I0319 10:37:10.379212 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj"] Mar 19 10:37:10 crc kubenswrapper[4765]: W0319 10:37:10.387518 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd3f409a1_a978_47fc_9907_fec4a720ae18.slice/crio-f585334b1d3eefe1df21abe119a844395729ddf2ef6474a0eb199633151fb812 WatchSource:0}: Error finding container f585334b1d3eefe1df21abe119a844395729ddf2ef6474a0eb199633151fb812: Status 404 returned error can't find the container with id f585334b1d3eefe1df21abe119a844395729ddf2ef6474a0eb199633151fb812 Mar 19 10:37:10 crc kubenswrapper[4765]: I0319 10:37:10.693639 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" event={"ID":"d3f409a1-a978-47fc-9907-fec4a720ae18","Type":"ContainerStarted","Data":"f585334b1d3eefe1df21abe119a844395729ddf2ef6474a0eb199633151fb812"} Mar 19 10:37:10 crc kubenswrapper[4765]: I0319 10:37:10.694933 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" event={"ID":"fc004464-2eb9-4b7d-addf-91b7b69e01b6","Type":"ContainerStarted","Data":"7251f72f3d05e50adeccf151791cd0e0e639bc422fe0a9bf37370e1c9b3f8fd7"} Mar 19 10:37:15 crc kubenswrapper[4765]: I0319 10:37:15.754703 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" event={"ID":"d3f409a1-a978-47fc-9907-fec4a720ae18","Type":"ContainerStarted","Data":"b912706155a9aee7c8b5221f2e98961be90940407e028e3304b063fed75c0c33"} Mar 19 10:37:15 crc kubenswrapper[4765]: I0319 10:37:15.755238 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" Mar 19 10:37:15 crc kubenswrapper[4765]: I0319 10:37:15.758448 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" event={"ID":"fc004464-2eb9-4b7d-addf-91b7b69e01b6","Type":"ContainerStarted","Data":"d1df6d80f8fcbdf157a33fd901e1bd4b2d48272ca083b3e2ae4fc8bc45aa1237"} Mar 19 10:37:15 crc kubenswrapper[4765]: I0319 10:37:15.759191 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" Mar 19 10:37:15 crc kubenswrapper[4765]: I0319 10:37:15.800036 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" podStartSLOduration=1.803165358 podStartE2EDuration="6.800011582s" podCreationTimestamp="2026-03-19 10:37:09 +0000 UTC" firstStartedPulling="2026-03-19 10:37:10.39178136 +0000 UTC m=+928.740726902" lastFinishedPulling="2026-03-19 10:37:15.388627594 +0000 UTC m=+933.737573126" observedRunningTime="2026-03-19 10:37:15.795158821 +0000 UTC m=+934.144104363" watchObservedRunningTime="2026-03-19 10:37:15.800011582 +0000 UTC m=+934.148957124" Mar 19 10:37:15 crc kubenswrapper[4765]: I0319 10:37:15.816232 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" podStartSLOduration=1.5238964830000001 podStartE2EDuration="6.816199671s" podCreationTimestamp="2026-03-19 10:37:09 +0000 UTC" firstStartedPulling="2026-03-19 10:37:10.072305225 +0000 UTC m=+928.421250767" lastFinishedPulling="2026-03-19 10:37:15.364608413 +0000 UTC m=+933.713553955" observedRunningTime="2026-03-19 10:37:15.814930177 +0000 UTC m=+934.163875719" watchObservedRunningTime="2026-03-19 10:37:15.816199671 +0000 UTC m=+934.165145213" Mar 19 10:37:30 crc kubenswrapper[4765]: I0319 10:37:30.136819 4765 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-757bfbd67b-zzdvj" Mar 19 10:37:45 crc kubenswrapper[4765]: I0319 10:37:45.447597 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-h67wb"] Mar 19 10:37:45 crc kubenswrapper[4765]: I0319 10:37:45.450079 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:45 crc kubenswrapper[4765]: I0319 10:37:45.469727 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h67wb"] Mar 19 10:37:45 crc kubenswrapper[4765]: I0319 10:37:45.520152 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-utilities\") pod \"certified-operators-h67wb\" (UID: \"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b\") " pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:45 crc kubenswrapper[4765]: I0319 10:37:45.520367 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-catalog-content\") pod \"certified-operators-h67wb\" (UID: \"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b\") " pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:45 crc kubenswrapper[4765]: I0319 10:37:45.520519 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zndmg\" (UniqueName: \"kubernetes.io/projected/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-kube-api-access-zndmg\") pod \"certified-operators-h67wb\" (UID: \"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b\") " pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:45 crc kubenswrapper[4765]: I0319 10:37:45.622182 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-utilities\") pod \"certified-operators-h67wb\" (UID: \"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b\") " pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:45 crc kubenswrapper[4765]: I0319 10:37:45.622261 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-catalog-content\") pod \"certified-operators-h67wb\" (UID: \"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b\") " pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:45 crc kubenswrapper[4765]: I0319 10:37:45.622292 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zndmg\" (UniqueName: \"kubernetes.io/projected/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-kube-api-access-zndmg\") pod \"certified-operators-h67wb\" (UID: \"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b\") " pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:45 crc kubenswrapper[4765]: I0319 10:37:45.622790 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-utilities\") pod \"certified-operators-h67wb\" (UID: \"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b\") " pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:45 crc kubenswrapper[4765]: I0319 10:37:45.622901 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-catalog-content\") pod \"certified-operators-h67wb\" (UID: \"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b\") " pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:45 crc kubenswrapper[4765]: I0319 10:37:45.644283 4765 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-zndmg\" (UniqueName: \"kubernetes.io/projected/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-kube-api-access-zndmg\") pod \"certified-operators-h67wb\" (UID: \"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b\") " pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:45 crc kubenswrapper[4765]: I0319 10:37:45.773367 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:46 crc kubenswrapper[4765]: I0319 10:37:46.100323 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-h67wb"] Mar 19 10:37:46 crc kubenswrapper[4765]: I0319 10:37:46.992745 4765 generic.go:334] "Generic (PLEG): container finished" podID="3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b" containerID="be864bc17292d2c20ed2e175a44137688f9d2e6b268131a3e3724632fd2cbe6e" exitCode=0 Mar 19 10:37:46 crc kubenswrapper[4765]: I0319 10:37:46.992807 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h67wb" event={"ID":"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b","Type":"ContainerDied","Data":"be864bc17292d2c20ed2e175a44137688f9d2e6b268131a3e3724632fd2cbe6e"} Mar 19 10:37:46 crc kubenswrapper[4765]: I0319 10:37:46.993069 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h67wb" event={"ID":"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b","Type":"ContainerStarted","Data":"de381f30296fd32cabf0716f66779fc2adbc3b000dd7f1263ac9f0bbca5b5445"} Mar 19 10:37:48 crc kubenswrapper[4765]: I0319 10:37:48.002223 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h67wb" event={"ID":"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b","Type":"ContainerStarted","Data":"c5fe170cc138ccb697375821d28a029d310eb73208ee8331606de6bca19fa81d"} Mar 19 10:37:49 crc kubenswrapper[4765]: I0319 10:37:49.009914 4765 generic.go:334] "Generic (PLEG): 
container finished" podID="3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b" containerID="c5fe170cc138ccb697375821d28a029d310eb73208ee8331606de6bca19fa81d" exitCode=0 Mar 19 10:37:49 crc kubenswrapper[4765]: I0319 10:37:49.010203 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h67wb" event={"ID":"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b","Type":"ContainerDied","Data":"c5fe170cc138ccb697375821d28a029d310eb73208ee8331606de6bca19fa81d"} Mar 19 10:37:49 crc kubenswrapper[4765]: I0319 10:37:49.759342 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-7b6fd59d6c-c8kfg" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.019833 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h67wb" event={"ID":"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b","Type":"ContainerStarted","Data":"113e0062dc5c3fe1caaa159781171825d5986d4829452e749c81beb693629eb8"} Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.427676 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-h67wb" podStartSLOduration=2.718393372 podStartE2EDuration="5.427657382s" podCreationTimestamp="2026-03-19 10:37:45 +0000 UTC" firstStartedPulling="2026-03-19 10:37:46.995372442 +0000 UTC m=+965.344317984" lastFinishedPulling="2026-03-19 10:37:49.704636442 +0000 UTC m=+968.053581994" observedRunningTime="2026-03-19 10:37:50.039839144 +0000 UTC m=+968.388784696" watchObservedRunningTime="2026-03-19 10:37:50.427657382 +0000 UTC m=+968.776602944" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.430363 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-w58kf"] Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.434176 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.436369 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.436767 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-z7g6c" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.439114 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.441241 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw"] Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.446452 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.453805 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.463872 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw"] Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.515322 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f2f40b4-6884-47cf-9845-7a45001ceda5-metrics-certs\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.515387 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0f2f40b4-6884-47cf-9845-7a45001ceda5-frr-conf\") pod \"frr-k8s-w58kf\" (UID: 
\"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.515421 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s5nrd\" (UniqueName: \"kubernetes.io/projected/bff02354-3273-4396-b996-06a749a9692f-kube-api-access-s5nrd\") pod \"frr-k8s-webhook-server-bcc4b6f68-f6fhw\" (UID: \"bff02354-3273-4396-b996-06a749a9692f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.515454 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0f2f40b4-6884-47cf-9845-7a45001ceda5-frr-startup\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.515472 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0f2f40b4-6884-47cf-9845-7a45001ceda5-metrics\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.515550 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0f2f40b4-6884-47cf-9845-7a45001ceda5-reloader\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.515574 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0f2f40b4-6884-47cf-9845-7a45001ceda5-frr-sockets\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " 
pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.515596 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp42s\" (UniqueName: \"kubernetes.io/projected/0f2f40b4-6884-47cf-9845-7a45001ceda5-kube-api-access-fp42s\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.515616 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bff02354-3273-4396-b996-06a749a9692f-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-f6fhw\" (UID: \"bff02354-3273-4396-b996-06a749a9692f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.547826 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-czcsr"] Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.548891 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-czcsr" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.551563 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.552822 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.552847 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.553103 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-k9k5t" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.562625 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-2rmkf"] Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.563629 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-2rmkf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.570106 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.581153 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-2rmkf"] Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.616767 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0f2f40b4-6884-47cf-9845-7a45001ceda5-reloader\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.616849 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0f2f40b4-6884-47cf-9845-7a45001ceda5-frr-sockets\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.616891 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp42s\" (UniqueName: \"kubernetes.io/projected/0f2f40b4-6884-47cf-9845-7a45001ceda5-kube-api-access-fp42s\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.616920 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bff02354-3273-4396-b996-06a749a9692f-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-f6fhw\" (UID: \"bff02354-3273-4396-b996-06a749a9692f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.616986 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-memberlist\") pod \"speaker-czcsr\" (UID: \"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71\") " pod="metallb-system/speaker-czcsr" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.617014 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f2f40b4-6884-47cf-9845-7a45001ceda5-metrics-certs\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.617052 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-metallb-excludel2\") pod \"speaker-czcsr\" (UID: \"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71\") " pod="metallb-system/speaker-czcsr" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.617094 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0f2f40b4-6884-47cf-9845-7a45001ceda5-frr-conf\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.617139 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s5nrd\" (UniqueName: \"kubernetes.io/projected/bff02354-3273-4396-b996-06a749a9692f-kube-api-access-s5nrd\") pod \"frr-k8s-webhook-server-bcc4b6f68-f6fhw\" (UID: \"bff02354-3273-4396-b996-06a749a9692f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw" Mar 19 10:37:50 crc kubenswrapper[4765]: E0319 10:37:50.617232 4765 secret.go:188] Couldn't get secret metallb-system/frr-k8s-certs-secret: secret "frr-k8s-certs-secret" not found Mar 19 
10:37:50 crc kubenswrapper[4765]: E0319 10:37:50.617332 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f2f40b4-6884-47cf-9845-7a45001ceda5-metrics-certs podName:0f2f40b4-6884-47cf-9845-7a45001ceda5 nodeName:}" failed. No retries permitted until 2026-03-19 10:37:51.117306946 +0000 UTC m=+969.466252698 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/0f2f40b4-6884-47cf-9845-7a45001ceda5-metrics-certs") pod "frr-k8s-w58kf" (UID: "0f2f40b4-6884-47cf-9845-7a45001ceda5") : secret "frr-k8s-certs-secret" not found Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.617432 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0f2f40b4-6884-47cf-9845-7a45001ceda5-frr-startup\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.617484 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0f2f40b4-6884-47cf-9845-7a45001ceda5-metrics\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.617555 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-metrics-certs\") pod \"speaker-czcsr\" (UID: \"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71\") " pod="metallb-system/speaker-czcsr" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.617599 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fm4lm\" (UniqueName: 
\"kubernetes.io/projected/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-kube-api-access-fm4lm\") pod \"speaker-czcsr\" (UID: \"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71\") " pod="metallb-system/speaker-czcsr" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.618521 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/0f2f40b4-6884-47cf-9845-7a45001ceda5-frr-startup\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.618793 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/0f2f40b4-6884-47cf-9845-7a45001ceda5-reloader\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.621291 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/0f2f40b4-6884-47cf-9845-7a45001ceda5-frr-sockets\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.623163 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/0f2f40b4-6884-47cf-9845-7a45001ceda5-frr-conf\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.623210 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/0f2f40b4-6884-47cf-9845-7a45001ceda5-metrics\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.626303 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/bff02354-3273-4396-b996-06a749a9692f-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-f6fhw\" (UID: \"bff02354-3273-4396-b996-06a749a9692f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.636695 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp42s\" (UniqueName: \"kubernetes.io/projected/0f2f40b4-6884-47cf-9845-7a45001ceda5-kube-api-access-fp42s\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.639746 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s5nrd\" (UniqueName: \"kubernetes.io/projected/bff02354-3273-4396-b996-06a749a9692f-kube-api-access-s5nrd\") pod \"frr-k8s-webhook-server-bcc4b6f68-f6fhw\" (UID: \"bff02354-3273-4396-b996-06a749a9692f\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.719523 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-metallb-excludel2\") pod \"speaker-czcsr\" (UID: \"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71\") " pod="metallb-system/speaker-czcsr" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.719611 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7hrh\" (UniqueName: \"kubernetes.io/projected/0f10db70-5575-427a-b0de-f36a4c0a5feb-kube-api-access-w7hrh\") pod \"controller-7bb4cc7c98-2rmkf\" (UID: \"0f10db70-5575-427a-b0de-f36a4c0a5feb\") " pod="metallb-system/controller-7bb4cc7c98-2rmkf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.719639 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f10db70-5575-427a-b0de-f36a4c0a5feb-cert\") pod \"controller-7bb4cc7c98-2rmkf\" (UID: \"0f10db70-5575-427a-b0de-f36a4c0a5feb\") " pod="metallb-system/controller-7bb4cc7c98-2rmkf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.719684 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fm4lm\" (UniqueName: \"kubernetes.io/projected/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-kube-api-access-fm4lm\") pod \"speaker-czcsr\" (UID: \"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71\") " pod="metallb-system/speaker-czcsr" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.719709 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-metrics-certs\") pod \"speaker-czcsr\" (UID: \"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71\") " pod="metallb-system/speaker-czcsr" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.719763 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f10db70-5575-427a-b0de-f36a4c0a5feb-metrics-certs\") pod \"controller-7bb4cc7c98-2rmkf\" (UID: \"0f10db70-5575-427a-b0de-f36a4c0a5feb\") " pod="metallb-system/controller-7bb4cc7c98-2rmkf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.719805 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-memberlist\") pod \"speaker-czcsr\" (UID: \"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71\") " pod="metallb-system/speaker-czcsr" Mar 19 10:37:50 crc kubenswrapper[4765]: E0319 10:37:50.719872 4765 secret.go:188] Couldn't get secret metallb-system/speaker-certs-secret: secret "speaker-certs-secret" not found Mar 19 10:37:50 crc 
kubenswrapper[4765]: E0319 10:37:50.719897 4765 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 10:37:50 crc kubenswrapper[4765]: E0319 10:37:50.719925 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-metrics-certs podName:afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71 nodeName:}" failed. No retries permitted until 2026-03-19 10:37:51.219910819 +0000 UTC m=+969.568856361 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-metrics-certs") pod "speaker-czcsr" (UID: "afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71") : secret "speaker-certs-secret" not found Mar 19 10:37:50 crc kubenswrapper[4765]: E0319 10:37:50.719940 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-memberlist podName:afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71 nodeName:}" failed. No retries permitted until 2026-03-19 10:37:51.219932739 +0000 UTC m=+969.568878281 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-memberlist") pod "speaker-czcsr" (UID: "afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71") : secret "metallb-memberlist" not found Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.720592 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-metallb-excludel2\") pod \"speaker-czcsr\" (UID: \"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71\") " pod="metallb-system/speaker-czcsr" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.737071 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fm4lm\" (UniqueName: \"kubernetes.io/projected/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-kube-api-access-fm4lm\") pod \"speaker-czcsr\" (UID: \"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71\") " pod="metallb-system/speaker-czcsr" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.763100 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.820640 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0f10db70-5575-427a-b0de-f36a4c0a5feb-cert\") pod \"controller-7bb4cc7c98-2rmkf\" (UID: \"0f10db70-5575-427a-b0de-f36a4c0a5feb\") " pod="metallb-system/controller-7bb4cc7c98-2rmkf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.820689 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7hrh\" (UniqueName: \"kubernetes.io/projected/0f10db70-5575-427a-b0de-f36a4c0a5feb-kube-api-access-w7hrh\") pod \"controller-7bb4cc7c98-2rmkf\" (UID: \"0f10db70-5575-427a-b0de-f36a4c0a5feb\") " pod="metallb-system/controller-7bb4cc7c98-2rmkf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.820766 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f10db70-5575-427a-b0de-f36a4c0a5feb-metrics-certs\") pod \"controller-7bb4cc7c98-2rmkf\" (UID: \"0f10db70-5575-427a-b0de-f36a4c0a5feb\") " pod="metallb-system/controller-7bb4cc7c98-2rmkf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.827299 4765 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.827907 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f10db70-5575-427a-b0de-f36a4c0a5feb-metrics-certs\") pod \"controller-7bb4cc7c98-2rmkf\" (UID: \"0f10db70-5575-427a-b0de-f36a4c0a5feb\") " pod="metallb-system/controller-7bb4cc7c98-2rmkf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.835812 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/0f10db70-5575-427a-b0de-f36a4c0a5feb-cert\") pod \"controller-7bb4cc7c98-2rmkf\" (UID: \"0f10db70-5575-427a-b0de-f36a4c0a5feb\") " pod="metallb-system/controller-7bb4cc7c98-2rmkf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.850651 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7hrh\" (UniqueName: \"kubernetes.io/projected/0f10db70-5575-427a-b0de-f36a4c0a5feb-kube-api-access-w7hrh\") pod \"controller-7bb4cc7c98-2rmkf\" (UID: \"0f10db70-5575-427a-b0de-f36a4c0a5feb\") " pod="metallb-system/controller-7bb4cc7c98-2rmkf" Mar 19 10:37:50 crc kubenswrapper[4765]: I0319 10:37:50.877599 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-2rmkf" Mar 19 10:37:51 crc kubenswrapper[4765]: I0319 10:37:51.129307 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f2f40b4-6884-47cf-9845-7a45001ceda5-metrics-certs\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:51 crc kubenswrapper[4765]: I0319 10:37:51.134578 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/0f2f40b4-6884-47cf-9845-7a45001ceda5-metrics-certs\") pod \"frr-k8s-w58kf\" (UID: \"0f2f40b4-6884-47cf-9845-7a45001ceda5\") " pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:51 crc kubenswrapper[4765]: I0319 10:37:51.159151 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-2rmkf"] Mar 19 10:37:51 crc kubenswrapper[4765]: W0319 10:37:51.163267 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f10db70_5575_427a_b0de_f36a4c0a5feb.slice/crio-f2725e2744c482c5b52d07b5eea1e4e80a132c8df78156ca33e1e974ab5cb640 WatchSource:0}: Error 
finding container f2725e2744c482c5b52d07b5eea1e4e80a132c8df78156ca33e1e974ab5cb640: Status 404 returned error can't find the container with id f2725e2744c482c5b52d07b5eea1e4e80a132c8df78156ca33e1e974ab5cb640 Mar 19 10:37:51 crc kubenswrapper[4765]: I0319 10:37:51.232829 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-metrics-certs\") pod \"speaker-czcsr\" (UID: \"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71\") " pod="metallb-system/speaker-czcsr" Mar 19 10:37:51 crc kubenswrapper[4765]: I0319 10:37:51.232930 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-memberlist\") pod \"speaker-czcsr\" (UID: \"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71\") " pod="metallb-system/speaker-czcsr" Mar 19 10:37:51 crc kubenswrapper[4765]: E0319 10:37:51.233166 4765 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 19 10:37:51 crc kubenswrapper[4765]: E0319 10:37:51.233275 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-memberlist podName:afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71 nodeName:}" failed. No retries permitted until 2026-03-19 10:37:52.233251952 +0000 UTC m=+970.582197494 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-memberlist") pod "speaker-czcsr" (UID: "afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71") : secret "metallb-memberlist" not found Mar 19 10:37:51 crc kubenswrapper[4765]: I0319 10:37:51.241256 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-metrics-certs\") pod \"speaker-czcsr\" (UID: \"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71\") " pod="metallb-system/speaker-czcsr" Mar 19 10:37:51 crc kubenswrapper[4765]: I0319 10:37:51.290947 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw"] Mar 19 10:37:51 crc kubenswrapper[4765]: I0319 10:37:51.350434 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-w58kf" Mar 19 10:37:52 crc kubenswrapper[4765]: I0319 10:37:52.035217 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw" event={"ID":"bff02354-3273-4396-b996-06a749a9692f","Type":"ContainerStarted","Data":"138c500ed701a961b0716c35f0d2e1ad19b10fd40b8709e5e90f2a011244e3a4"} Mar 19 10:37:52 crc kubenswrapper[4765]: I0319 10:37:52.038664 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-2rmkf" event={"ID":"0f10db70-5575-427a-b0de-f36a4c0a5feb","Type":"ContainerStarted","Data":"e81bbf596784b6cbb529e591d2a14c768dfeffcd8dfe947bd8ff462a3a69c997"} Mar 19 10:37:52 crc kubenswrapper[4765]: I0319 10:37:52.038826 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-2rmkf" event={"ID":"0f10db70-5575-427a-b0de-f36a4c0a5feb","Type":"ContainerStarted","Data":"5f1c8e915c544584ff0544fa6b6378f98d4109c2ea7c3ac3d7e29d92f33a1e57"} Mar 19 10:37:52 crc kubenswrapper[4765]: I0319 10:37:52.038843 4765 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-2rmkf" event={"ID":"0f10db70-5575-427a-b0de-f36a4c0a5feb","Type":"ContainerStarted","Data":"f2725e2744c482c5b52d07b5eea1e4e80a132c8df78156ca33e1e974ab5cb640"} Mar 19 10:37:52 crc kubenswrapper[4765]: I0319 10:37:52.039020 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-2rmkf" Mar 19 10:37:52 crc kubenswrapper[4765]: I0319 10:37:52.039538 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w58kf" event={"ID":"0f2f40b4-6884-47cf-9845-7a45001ceda5","Type":"ContainerStarted","Data":"0133b48f5930a4736b3b13df308f40f13bc3670de589bb29651e743e15b56561"} Mar 19 10:37:52 crc kubenswrapper[4765]: I0319 10:37:52.056386 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-2rmkf" podStartSLOduration=2.056358176 podStartE2EDuration="2.056358176s" podCreationTimestamp="2026-03-19 10:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:37:52.05614345 +0000 UTC m=+970.405089002" watchObservedRunningTime="2026-03-19 10:37:52.056358176 +0000 UTC m=+970.405303718" Mar 19 10:37:52 crc kubenswrapper[4765]: I0319 10:37:52.248304 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-memberlist\") pod \"speaker-czcsr\" (UID: \"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71\") " pod="metallb-system/speaker-czcsr" Mar 19 10:37:52 crc kubenswrapper[4765]: I0319 10:37:52.252675 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71-memberlist\") pod \"speaker-czcsr\" (UID: \"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71\") " 
pod="metallb-system/speaker-czcsr" Mar 19 10:37:52 crc kubenswrapper[4765]: I0319 10:37:52.365207 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-czcsr" Mar 19 10:37:53 crc kubenswrapper[4765]: I0319 10:37:53.051818 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-czcsr" event={"ID":"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71","Type":"ContainerStarted","Data":"b61091e5bcdacbd34f314ec8c05d58d764c3be785f176f888efa43cc8f8f2801"} Mar 19 10:37:53 crc kubenswrapper[4765]: I0319 10:37:53.051875 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-czcsr" event={"ID":"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71","Type":"ContainerStarted","Data":"758b102c65118cdd8bb74ba4a57a6202087301f72c8e5a29bd753b737c581f8c"} Mar 19 10:37:53 crc kubenswrapper[4765]: I0319 10:37:53.051887 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-czcsr" event={"ID":"afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71","Type":"ContainerStarted","Data":"3c07c3677c1a9f0e68001344e769b0b597f848431f9c92ef878668e4384dc3c3"} Mar 19 10:37:53 crc kubenswrapper[4765]: I0319 10:37:53.052271 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-czcsr" Mar 19 10:37:53 crc kubenswrapper[4765]: I0319 10:37:53.081245 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-czcsr" podStartSLOduration=3.081221662 podStartE2EDuration="3.081221662s" podCreationTimestamp="2026-03-19 10:37:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:37:53.075360253 +0000 UTC m=+971.424305815" watchObservedRunningTime="2026-03-19 10:37:53.081221662 +0000 UTC m=+971.430167224" Mar 19 10:37:55 crc kubenswrapper[4765]: I0319 10:37:55.774365 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:55 crc kubenswrapper[4765]: I0319 10:37:55.774896 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:55 crc kubenswrapper[4765]: I0319 10:37:55.824436 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:56 crc kubenswrapper[4765]: I0319 10:37:56.139298 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:56 crc kubenswrapper[4765]: I0319 10:37:56.191772 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h67wb"] Mar 19 10:37:58 crc kubenswrapper[4765]: I0319 10:37:58.098150 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-h67wb" podUID="3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b" containerName="registry-server" containerID="cri-o://113e0062dc5c3fe1caaa159781171825d5986d4829452e749c81beb693629eb8" gracePeriod=2 Mar 19 10:37:59 crc kubenswrapper[4765]: I0319 10:37:59.107727 4765 generic.go:334] "Generic (PLEG): container finished" podID="3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b" containerID="113e0062dc5c3fe1caaa159781171825d5986d4829452e749c81beb693629eb8" exitCode=0 Mar 19 10:37:59 crc kubenswrapper[4765]: I0319 10:37:59.107770 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h67wb" event={"ID":"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b","Type":"ContainerDied","Data":"113e0062dc5c3fe1caaa159781171825d5986d4829452e749c81beb693629eb8"} Mar 19 10:37:59 crc kubenswrapper[4765]: I0319 10:37:59.447630 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:37:59 crc kubenswrapper[4765]: I0319 10:37:59.591231 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-utilities\") pod \"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b\" (UID: \"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b\") " Mar 19 10:37:59 crc kubenswrapper[4765]: I0319 10:37:59.591360 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zndmg\" (UniqueName: \"kubernetes.io/projected/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-kube-api-access-zndmg\") pod \"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b\" (UID: \"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b\") " Mar 19 10:37:59 crc kubenswrapper[4765]: I0319 10:37:59.591429 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-catalog-content\") pod \"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b\" (UID: \"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b\") " Mar 19 10:37:59 crc kubenswrapper[4765]: I0319 10:37:59.592077 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-utilities" (OuterVolumeSpecName: "utilities") pod "3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b" (UID: "3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:37:59 crc kubenswrapper[4765]: I0319 10:37:59.598387 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-kube-api-access-zndmg" (OuterVolumeSpecName: "kube-api-access-zndmg") pod "3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b" (UID: "3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b"). InnerVolumeSpecName "kube-api-access-zndmg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:37:59 crc kubenswrapper[4765]: I0319 10:37:59.638277 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b" (UID: "3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:37:59 crc kubenswrapper[4765]: I0319 10:37:59.692505 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zndmg\" (UniqueName: \"kubernetes.io/projected/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-kube-api-access-zndmg\") on node \"crc\" DevicePath \"\"" Mar 19 10:37:59 crc kubenswrapper[4765]: I0319 10:37:59.692542 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:37:59 crc kubenswrapper[4765]: I0319 10:37:59.692551 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.116064 4765 generic.go:334] "Generic (PLEG): container finished" podID="0f2f40b4-6884-47cf-9845-7a45001ceda5" containerID="33f1b2c7c82ff6705fa1efe57123258dacead79b8c4e118eafa9f9e49ba04570" exitCode=0 Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.117418 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w58kf" event={"ID":"0f2f40b4-6884-47cf-9845-7a45001ceda5","Type":"ContainerDied","Data":"33f1b2c7c82ff6705fa1efe57123258dacead79b8c4e118eafa9f9e49ba04570"} Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.120604 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-h67wb" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.120571 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-h67wb" event={"ID":"3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b","Type":"ContainerDied","Data":"de381f30296fd32cabf0716f66779fc2adbc3b000dd7f1263ac9f0bbca5b5445"} Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.121013 4765 scope.go:117] "RemoveContainer" containerID="113e0062dc5c3fe1caaa159781171825d5986d4829452e749c81beb693629eb8" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.126350 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw" event={"ID":"bff02354-3273-4396-b996-06a749a9692f","Type":"ContainerStarted","Data":"3978e622c69c7a8f0418d046e75f6c7a907e094ada3ea539d42ecb267e9e1d5b"} Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.126929 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.131707 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565278-cl6lh"] Mar 19 10:38:00 crc kubenswrapper[4765]: E0319 10:38:00.132183 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b" containerName="extract-content" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.132227 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b" containerName="extract-content" Mar 19 10:38:00 crc kubenswrapper[4765]: E0319 10:38:00.132245 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b" containerName="extract-utilities" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.132254 4765 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b" containerName="extract-utilities" Mar 19 10:38:00 crc kubenswrapper[4765]: E0319 10:38:00.132263 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b" containerName="registry-server" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.132270 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b" containerName="registry-server" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.132492 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b" containerName="registry-server" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.134235 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565278-cl6lh" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.140915 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.141039 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.140942 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.145505 4765 scope.go:117] "RemoveContainer" containerID="c5fe170cc138ccb697375821d28a029d310eb73208ee8331606de6bca19fa81d" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.146613 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565278-cl6lh"] Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.186173 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-h67wb"] Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.190187 4765 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-h67wb"] Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.216734 4765 scope.go:117] "RemoveContainer" containerID="be864bc17292d2c20ed2e175a44137688f9d2e6b268131a3e3724632fd2cbe6e" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.303669 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw6h7\" (UniqueName: \"kubernetes.io/projected/141a4fdb-54ad-466b-90d6-b9209e18b1a7-kube-api-access-vw6h7\") pod \"auto-csr-approver-29565278-cl6lh\" (UID: \"141a4fdb-54ad-466b-90d6-b9209e18b1a7\") " pod="openshift-infra/auto-csr-approver-29565278-cl6lh" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.371853 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b" path="/var/lib/kubelet/pods/3ed1c2ae-abbc-4c1f-8b99-8abf47703f1b/volumes" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.405069 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vw6h7\" (UniqueName: \"kubernetes.io/projected/141a4fdb-54ad-466b-90d6-b9209e18b1a7-kube-api-access-vw6h7\") pod \"auto-csr-approver-29565278-cl6lh\" (UID: \"141a4fdb-54ad-466b-90d6-b9209e18b1a7\") " pod="openshift-infra/auto-csr-approver-29565278-cl6lh" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.425038 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw6h7\" (UniqueName: \"kubernetes.io/projected/141a4fdb-54ad-466b-90d6-b9209e18b1a7-kube-api-access-vw6h7\") pod \"auto-csr-approver-29565278-cl6lh\" (UID: \"141a4fdb-54ad-466b-90d6-b9209e18b1a7\") " pod="openshift-infra/auto-csr-approver-29565278-cl6lh" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.476164 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565278-cl6lh" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.702896 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw" podStartSLOduration=2.760471312 podStartE2EDuration="10.702875466s" podCreationTimestamp="2026-03-19 10:37:50 +0000 UTC" firstStartedPulling="2026-03-19 10:37:51.29881574 +0000 UTC m=+969.647761282" lastFinishedPulling="2026-03-19 10:37:59.241219894 +0000 UTC m=+977.590165436" observedRunningTime="2026-03-19 10:38:00.207137041 +0000 UTC m=+978.556082613" watchObservedRunningTime="2026-03-19 10:38:00.702875466 +0000 UTC m=+979.051821008" Mar 19 10:38:00 crc kubenswrapper[4765]: I0319 10:38:00.705298 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565278-cl6lh"] Mar 19 10:38:00 crc kubenswrapper[4765]: W0319 10:38:00.707288 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod141a4fdb_54ad_466b_90d6_b9209e18b1a7.slice/crio-758ac765bc8c42bfa97817fc8ea861778d164f49c629363edcf3fc9c480edb99 WatchSource:0}: Error finding container 758ac765bc8c42bfa97817fc8ea861778d164f49c629363edcf3fc9c480edb99: Status 404 returned error can't find the container with id 758ac765bc8c42bfa97817fc8ea861778d164f49c629363edcf3fc9c480edb99 Mar 19 10:38:01 crc kubenswrapper[4765]: I0319 10:38:01.137389 4765 generic.go:334] "Generic (PLEG): container finished" podID="0f2f40b4-6884-47cf-9845-7a45001ceda5" containerID="d99314389bdc07c4b588126292d05ed9d51d27653a6ecfc7679b7671d3e81a8f" exitCode=0 Mar 19 10:38:01 crc kubenswrapper[4765]: I0319 10:38:01.137536 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w58kf" event={"ID":"0f2f40b4-6884-47cf-9845-7a45001ceda5","Type":"ContainerDied","Data":"d99314389bdc07c4b588126292d05ed9d51d27653a6ecfc7679b7671d3e81a8f"} Mar 19 
10:38:01 crc kubenswrapper[4765]: I0319 10:38:01.140526 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565278-cl6lh" event={"ID":"141a4fdb-54ad-466b-90d6-b9209e18b1a7","Type":"ContainerStarted","Data":"758ac765bc8c42bfa97817fc8ea861778d164f49c629363edcf3fc9c480edb99"} Mar 19 10:38:02 crc kubenswrapper[4765]: I0319 10:38:02.151262 4765 generic.go:334] "Generic (PLEG): container finished" podID="0f2f40b4-6884-47cf-9845-7a45001ceda5" containerID="20a5c7a05336cf5a0b4b185f3894f6fca9fedcbd9bed627a50240b4b8bccd40e" exitCode=0 Mar 19 10:38:02 crc kubenswrapper[4765]: I0319 10:38:02.151325 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w58kf" event={"ID":"0f2f40b4-6884-47cf-9845-7a45001ceda5","Type":"ContainerDied","Data":"20a5c7a05336cf5a0b4b185f3894f6fca9fedcbd9bed627a50240b4b8bccd40e"} Mar 19 10:38:02 crc kubenswrapper[4765]: I0319 10:38:02.154192 4765 generic.go:334] "Generic (PLEG): container finished" podID="141a4fdb-54ad-466b-90d6-b9209e18b1a7" containerID="b2647d1bbe6b3a8341f4c6c0546cb3540a72d35faa02bcf3f0607dd968fb223b" exitCode=0 Mar 19 10:38:02 crc kubenswrapper[4765]: I0319 10:38:02.154227 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565278-cl6lh" event={"ID":"141a4fdb-54ad-466b-90d6-b9209e18b1a7","Type":"ContainerDied","Data":"b2647d1bbe6b3a8341f4c6c0546cb3540a72d35faa02bcf3f0607dd968fb223b"} Mar 19 10:38:02 crc kubenswrapper[4765]: I0319 10:38:02.371319 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-czcsr" Mar 19 10:38:03 crc kubenswrapper[4765]: I0319 10:38:03.172577 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w58kf" event={"ID":"0f2f40b4-6884-47cf-9845-7a45001ceda5","Type":"ContainerStarted","Data":"2885a8430d204966c7415ad09b32fcd25a81439b91e6a6185a79eb7ce82cda71"} Mar 19 10:38:03 crc kubenswrapper[4765]: I0319 10:38:03.172937 4765 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w58kf" event={"ID":"0f2f40b4-6884-47cf-9845-7a45001ceda5","Type":"ContainerStarted","Data":"6fd8256cb0db2e4c40c5948ba9fbbf0daa25cc16bbaaf3a943fe164faa77cc7e"} Mar 19 10:38:03 crc kubenswrapper[4765]: I0319 10:38:03.172952 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w58kf" event={"ID":"0f2f40b4-6884-47cf-9845-7a45001ceda5","Type":"ContainerStarted","Data":"1da3182030e029d5eec95c74f76dc73839352d849f114f4928909c98ae4a156e"} Mar 19 10:38:03 crc kubenswrapper[4765]: I0319 10:38:03.172984 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w58kf" event={"ID":"0f2f40b4-6884-47cf-9845-7a45001ceda5","Type":"ContainerStarted","Data":"75e5a73de7ce16d3a26d960b87d217178a93f1b6ea07b4bf827b30732e82e692"} Mar 19 10:38:03 crc kubenswrapper[4765]: I0319 10:38:03.172999 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w58kf" event={"ID":"0f2f40b4-6884-47cf-9845-7a45001ceda5","Type":"ContainerStarted","Data":"a700061e4206aac94f07b98eeeea502963a7d42b17857dfe35a2a754ef56636e"} Mar 19 10:38:03 crc kubenswrapper[4765]: I0319 10:38:03.423789 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565278-cl6lh" Mar 19 10:38:03 crc kubenswrapper[4765]: I0319 10:38:03.552273 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vw6h7\" (UniqueName: \"kubernetes.io/projected/141a4fdb-54ad-466b-90d6-b9209e18b1a7-kube-api-access-vw6h7\") pod \"141a4fdb-54ad-466b-90d6-b9209e18b1a7\" (UID: \"141a4fdb-54ad-466b-90d6-b9209e18b1a7\") " Mar 19 10:38:03 crc kubenswrapper[4765]: I0319 10:38:03.562075 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/141a4fdb-54ad-466b-90d6-b9209e18b1a7-kube-api-access-vw6h7" (OuterVolumeSpecName: "kube-api-access-vw6h7") pod "141a4fdb-54ad-466b-90d6-b9209e18b1a7" (UID: "141a4fdb-54ad-466b-90d6-b9209e18b1a7"). InnerVolumeSpecName "kube-api-access-vw6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:38:03 crc kubenswrapper[4765]: I0319 10:38:03.654438 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vw6h7\" (UniqueName: \"kubernetes.io/projected/141a4fdb-54ad-466b-90d6-b9209e18b1a7-kube-api-access-vw6h7\") on node \"crc\" DevicePath \"\"" Mar 19 10:38:04 crc kubenswrapper[4765]: I0319 10:38:04.196888 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-w58kf" event={"ID":"0f2f40b4-6884-47cf-9845-7a45001ceda5","Type":"ContainerStarted","Data":"77e499a3483ec04fa2e9ab3108f46624742564ea3b32b7bf896e1c3f3391b8a7"} Mar 19 10:38:04 crc kubenswrapper[4765]: I0319 10:38:04.197692 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-w58kf" Mar 19 10:38:04 crc kubenswrapper[4765]: I0319 10:38:04.198767 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565278-cl6lh" event={"ID":"141a4fdb-54ad-466b-90d6-b9209e18b1a7","Type":"ContainerDied","Data":"758ac765bc8c42bfa97817fc8ea861778d164f49c629363edcf3fc9c480edb99"} Mar 19 
10:38:04 crc kubenswrapper[4765]: I0319 10:38:04.198792 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="758ac765bc8c42bfa97817fc8ea861778d164f49c629363edcf3fc9c480edb99" Mar 19 10:38:04 crc kubenswrapper[4765]: I0319 10:38:04.198890 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565278-cl6lh" Mar 19 10:38:04 crc kubenswrapper[4765]: I0319 10:38:04.232002 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-w58kf" podStartSLOduration=6.409656666 podStartE2EDuration="14.231975023s" podCreationTimestamp="2026-03-19 10:37:50 +0000 UTC" firstStartedPulling="2026-03-19 10:37:51.447444931 +0000 UTC m=+969.796390483" lastFinishedPulling="2026-03-19 10:37:59.269763298 +0000 UTC m=+977.618708840" observedRunningTime="2026-03-19 10:38:04.227900962 +0000 UTC m=+982.576846514" watchObservedRunningTime="2026-03-19 10:38:04.231975023 +0000 UTC m=+982.580920565" Mar 19 10:38:04 crc kubenswrapper[4765]: I0319 10:38:04.490560 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565272-c8lkp"] Mar 19 10:38:04 crc kubenswrapper[4765]: I0319 10:38:04.496397 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565272-c8lkp"] Mar 19 10:38:05 crc kubenswrapper[4765]: I0319 10:38:05.208275 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-dkxg9"] Mar 19 10:38:05 crc kubenswrapper[4765]: E0319 10:38:05.208556 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="141a4fdb-54ad-466b-90d6-b9209e18b1a7" containerName="oc" Mar 19 10:38:05 crc kubenswrapper[4765]: I0319 10:38:05.208568 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="141a4fdb-54ad-466b-90d6-b9209e18b1a7" containerName="oc" Mar 19 10:38:05 crc kubenswrapper[4765]: I0319 10:38:05.208729 4765 
memory_manager.go:354] "RemoveStaleState removing state" podUID="141a4fdb-54ad-466b-90d6-b9209e18b1a7" containerName="oc" Mar 19 10:38:05 crc kubenswrapper[4765]: I0319 10:38:05.209242 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dkxg9" Mar 19 10:38:05 crc kubenswrapper[4765]: I0319 10:38:05.213347 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 19 10:38:05 crc kubenswrapper[4765]: I0319 10:38:05.215862 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-dnbxn" Mar 19 10:38:05 crc kubenswrapper[4765]: I0319 10:38:05.216157 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 19 10:38:05 crc kubenswrapper[4765]: I0319 10:38:05.241155 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dkxg9"] Mar 19 10:38:05 crc kubenswrapper[4765]: I0319 10:38:05.378129 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mlwdh\" (UniqueName: \"kubernetes.io/projected/3f941f10-6189-4498-b098-e5d78adb509e-kube-api-access-mlwdh\") pod \"openstack-operator-index-dkxg9\" (UID: \"3f941f10-6189-4498-b098-e5d78adb509e\") " pod="openstack-operators/openstack-operator-index-dkxg9" Mar 19 10:38:05 crc kubenswrapper[4765]: I0319 10:38:05.480375 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mlwdh\" (UniqueName: \"kubernetes.io/projected/3f941f10-6189-4498-b098-e5d78adb509e-kube-api-access-mlwdh\") pod \"openstack-operator-index-dkxg9\" (UID: \"3f941f10-6189-4498-b098-e5d78adb509e\") " pod="openstack-operators/openstack-operator-index-dkxg9" Mar 19 10:38:05 crc kubenswrapper[4765]: I0319 10:38:05.504553 4765 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mlwdh\" (UniqueName: \"kubernetes.io/projected/3f941f10-6189-4498-b098-e5d78adb509e-kube-api-access-mlwdh\") pod \"openstack-operator-index-dkxg9\" (UID: \"3f941f10-6189-4498-b098-e5d78adb509e\") " pod="openstack-operators/openstack-operator-index-dkxg9" Mar 19 10:38:05 crc kubenswrapper[4765]: I0319 10:38:05.529290 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-dkxg9" Mar 19 10:38:05 crc kubenswrapper[4765]: I0319 10:38:05.780267 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-dkxg9"] Mar 19 10:38:06 crc kubenswrapper[4765]: I0319 10:38:06.215931 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dkxg9" event={"ID":"3f941f10-6189-4498-b098-e5d78adb509e","Type":"ContainerStarted","Data":"9df5e0c76ae6b361fe4c7d313a1b672b7c4c93bc2a35797cd0e40d83b53841ba"} Mar 19 10:38:06 crc kubenswrapper[4765]: I0319 10:38:06.351607 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-w58kf" Mar 19 10:38:06 crc kubenswrapper[4765]: I0319 10:38:06.373053 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24961073-0b01-4ce3-8929-e916408b3431" path="/var/lib/kubelet/pods/24961073-0b01-4ce3-8929-e916408b3431/volumes" Mar 19 10:38:06 crc kubenswrapper[4765]: I0319 10:38:06.398797 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-w58kf" Mar 19 10:38:08 crc kubenswrapper[4765]: I0319 10:38:08.574042 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dkxg9"] Mar 19 10:38:09 crc kubenswrapper[4765]: I0319 10:38:09.186143 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-fcqz8"] Mar 19 10:38:09 crc kubenswrapper[4765]: 
I0319 10:38:09.187510 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fcqz8" Mar 19 10:38:09 crc kubenswrapper[4765]: I0319 10:38:09.195601 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fcqz8"] Mar 19 10:38:09 crc kubenswrapper[4765]: I0319 10:38:09.244478 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dkxg9" event={"ID":"3f941f10-6189-4498-b098-e5d78adb509e","Type":"ContainerStarted","Data":"9a9ab62979de7b07b7171ffdd3ab5fd0b2288f7b5ef94fcbb2f49827e6058817"} Mar 19 10:38:09 crc kubenswrapper[4765]: I0319 10:38:09.262636 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-dkxg9" podStartSLOduration=1.877562137 podStartE2EDuration="4.262603284s" podCreationTimestamp="2026-03-19 10:38:05 +0000 UTC" firstStartedPulling="2026-03-19 10:38:05.791823779 +0000 UTC m=+984.140769321" lastFinishedPulling="2026-03-19 10:38:08.176864926 +0000 UTC m=+986.525810468" observedRunningTime="2026-03-19 10:38:09.258863172 +0000 UTC m=+987.607808724" watchObservedRunningTime="2026-03-19 10:38:09.262603284 +0000 UTC m=+987.611548826" Mar 19 10:38:09 crc kubenswrapper[4765]: I0319 10:38:09.349926 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9p26\" (UniqueName: \"kubernetes.io/projected/0836451e-5b5f-47bd-8722-283ab5d34a5c-kube-api-access-g9p26\") pod \"openstack-operator-index-fcqz8\" (UID: \"0836451e-5b5f-47bd-8722-283ab5d34a5c\") " pod="openstack-operators/openstack-operator-index-fcqz8" Mar 19 10:38:09 crc kubenswrapper[4765]: I0319 10:38:09.451376 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9p26\" (UniqueName: \"kubernetes.io/projected/0836451e-5b5f-47bd-8722-283ab5d34a5c-kube-api-access-g9p26\") 
pod \"openstack-operator-index-fcqz8\" (UID: \"0836451e-5b5f-47bd-8722-283ab5d34a5c\") " pod="openstack-operators/openstack-operator-index-fcqz8" Mar 19 10:38:09 crc kubenswrapper[4765]: I0319 10:38:09.473433 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9p26\" (UniqueName: \"kubernetes.io/projected/0836451e-5b5f-47bd-8722-283ab5d34a5c-kube-api-access-g9p26\") pod \"openstack-operator-index-fcqz8\" (UID: \"0836451e-5b5f-47bd-8722-283ab5d34a5c\") " pod="openstack-operators/openstack-operator-index-fcqz8" Mar 19 10:38:09 crc kubenswrapper[4765]: I0319 10:38:09.514599 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-fcqz8" Mar 19 10:38:09 crc kubenswrapper[4765]: I0319 10:38:09.923797 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-fcqz8"] Mar 19 10:38:09 crc kubenswrapper[4765]: W0319 10:38:09.931794 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0836451e_5b5f_47bd_8722_283ab5d34a5c.slice/crio-7f590961fc63d4d8388f62f4cb628ef5b812d54a36d1940198cdba8f7d226d1b WatchSource:0}: Error finding container 7f590961fc63d4d8388f62f4cb628ef5b812d54a36d1940198cdba8f7d226d1b: Status 404 returned error can't find the container with id 7f590961fc63d4d8388f62f4cb628ef5b812d54a36d1940198cdba8f7d226d1b Mar 19 10:38:10 crc kubenswrapper[4765]: I0319 10:38:10.252207 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-dkxg9" podUID="3f941f10-6189-4498-b098-e5d78adb509e" containerName="registry-server" containerID="cri-o://9a9ab62979de7b07b7171ffdd3ab5fd0b2288f7b5ef94fcbb2f49827e6058817" gracePeriod=2 Mar 19 10:38:10 crc kubenswrapper[4765]: I0319 10:38:10.252711 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/openstack-operator-index-fcqz8" event={"ID":"0836451e-5b5f-47bd-8722-283ab5d34a5c","Type":"ContainerStarted","Data":"04c5bf2b46451811efd980341a3081260a02ffb32c0474843c8ec9f4f5c25b25"} Mar 19 10:38:10 crc kubenswrapper[4765]: I0319 10:38:10.252769 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-fcqz8" event={"ID":"0836451e-5b5f-47bd-8722-283ab5d34a5c","Type":"ContainerStarted","Data":"7f590961fc63d4d8388f62f4cb628ef5b812d54a36d1940198cdba8f7d226d1b"} Mar 19 10:38:10 crc kubenswrapper[4765]: I0319 10:38:10.271682 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-fcqz8" podStartSLOduration=1.220605426 podStartE2EDuration="1.271665431s" podCreationTimestamp="2026-03-19 10:38:09 +0000 UTC" firstStartedPulling="2026-03-19 10:38:09.935458403 +0000 UTC m=+988.284403945" lastFinishedPulling="2026-03-19 10:38:09.986518408 +0000 UTC m=+988.335463950" observedRunningTime="2026-03-19 10:38:10.266755218 +0000 UTC m=+988.615700770" watchObservedRunningTime="2026-03-19 10:38:10.271665431 +0000 UTC m=+988.620610973" Mar 19 10:38:10 crc kubenswrapper[4765]: I0319 10:38:10.592116 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dkxg9" Mar 19 10:38:10 crc kubenswrapper[4765]: I0319 10:38:10.769378 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-f6fhw" Mar 19 10:38:10 crc kubenswrapper[4765]: I0319 10:38:10.769549 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mlwdh\" (UniqueName: \"kubernetes.io/projected/3f941f10-6189-4498-b098-e5d78adb509e-kube-api-access-mlwdh\") pod \"3f941f10-6189-4498-b098-e5d78adb509e\" (UID: \"3f941f10-6189-4498-b098-e5d78adb509e\") " Mar 19 10:38:10 crc kubenswrapper[4765]: I0319 10:38:10.776508 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3f941f10-6189-4498-b098-e5d78adb509e-kube-api-access-mlwdh" (OuterVolumeSpecName: "kube-api-access-mlwdh") pod "3f941f10-6189-4498-b098-e5d78adb509e" (UID: "3f941f10-6189-4498-b098-e5d78adb509e"). InnerVolumeSpecName "kube-api-access-mlwdh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:38:10 crc kubenswrapper[4765]: I0319 10:38:10.871274 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mlwdh\" (UniqueName: \"kubernetes.io/projected/3f941f10-6189-4498-b098-e5d78adb509e-kube-api-access-mlwdh\") on node \"crc\" DevicePath \"\"" Mar 19 10:38:10 crc kubenswrapper[4765]: I0319 10:38:10.883672 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-2rmkf" Mar 19 10:38:11 crc kubenswrapper[4765]: I0319 10:38:11.262808 4765 generic.go:334] "Generic (PLEG): container finished" podID="3f941f10-6189-4498-b098-e5d78adb509e" containerID="9a9ab62979de7b07b7171ffdd3ab5fd0b2288f7b5ef94fcbb2f49827e6058817" exitCode=0 Mar 19 10:38:11 crc kubenswrapper[4765]: I0319 10:38:11.263045 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-dkxg9" Mar 19 10:38:11 crc kubenswrapper[4765]: I0319 10:38:11.263084 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dkxg9" event={"ID":"3f941f10-6189-4498-b098-e5d78adb509e","Type":"ContainerDied","Data":"9a9ab62979de7b07b7171ffdd3ab5fd0b2288f7b5ef94fcbb2f49827e6058817"} Mar 19 10:38:11 crc kubenswrapper[4765]: I0319 10:38:11.263221 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-dkxg9" event={"ID":"3f941f10-6189-4498-b098-e5d78adb509e","Type":"ContainerDied","Data":"9df5e0c76ae6b361fe4c7d313a1b672b7c4c93bc2a35797cd0e40d83b53841ba"} Mar 19 10:38:11 crc kubenswrapper[4765]: I0319 10:38:11.263273 4765 scope.go:117] "RemoveContainer" containerID="9a9ab62979de7b07b7171ffdd3ab5fd0b2288f7b5ef94fcbb2f49827e6058817" Mar 19 10:38:11 crc kubenswrapper[4765]: I0319 10:38:11.285688 4765 scope.go:117] "RemoveContainer" containerID="9a9ab62979de7b07b7171ffdd3ab5fd0b2288f7b5ef94fcbb2f49827e6058817" Mar 19 10:38:11 crc kubenswrapper[4765]: E0319 10:38:11.287768 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9a9ab62979de7b07b7171ffdd3ab5fd0b2288f7b5ef94fcbb2f49827e6058817\": container with ID starting with 9a9ab62979de7b07b7171ffdd3ab5fd0b2288f7b5ef94fcbb2f49827e6058817 not found: ID does not exist" containerID="9a9ab62979de7b07b7171ffdd3ab5fd0b2288f7b5ef94fcbb2f49827e6058817" Mar 19 10:38:11 crc kubenswrapper[4765]: I0319 10:38:11.287850 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9a9ab62979de7b07b7171ffdd3ab5fd0b2288f7b5ef94fcbb2f49827e6058817"} err="failed to get container status \"9a9ab62979de7b07b7171ffdd3ab5fd0b2288f7b5ef94fcbb2f49827e6058817\": rpc error: code = NotFound desc = could not find container 
\"9a9ab62979de7b07b7171ffdd3ab5fd0b2288f7b5ef94fcbb2f49827e6058817\": container with ID starting with 9a9ab62979de7b07b7171ffdd3ab5fd0b2288f7b5ef94fcbb2f49827e6058817 not found: ID does not exist" Mar 19 10:38:11 crc kubenswrapper[4765]: I0319 10:38:11.310715 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-dkxg9"] Mar 19 10:38:11 crc kubenswrapper[4765]: I0319 10:38:11.314429 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-dkxg9"] Mar 19 10:38:12 crc kubenswrapper[4765]: I0319 10:38:12.364282 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3f941f10-6189-4498-b098-e5d78adb509e" path="/var/lib/kubelet/pods/3f941f10-6189-4498-b098-e5d78adb509e/volumes" Mar 19 10:38:16 crc kubenswrapper[4765]: I0319 10:38:16.789352 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-89gkq"] Mar 19 10:38:16 crc kubenswrapper[4765]: E0319 10:38:16.790255 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3f941f10-6189-4498-b098-e5d78adb509e" containerName="registry-server" Mar 19 10:38:16 crc kubenswrapper[4765]: I0319 10:38:16.790268 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3f941f10-6189-4498-b098-e5d78adb509e" containerName="registry-server" Mar 19 10:38:16 crc kubenswrapper[4765]: I0319 10:38:16.790401 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3f941f10-6189-4498-b098-e5d78adb509e" containerName="registry-server" Mar 19 10:38:16 crc kubenswrapper[4765]: I0319 10:38:16.791256 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:16 crc kubenswrapper[4765]: I0319 10:38:16.797934 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-89gkq"] Mar 19 10:38:16 crc kubenswrapper[4765]: I0319 10:38:16.961855 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3eb272d-2bcb-461c-b8fc-a569462389c1-utilities\") pod \"community-operators-89gkq\" (UID: \"c3eb272d-2bcb-461c-b8fc-a569462389c1\") " pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:16 crc kubenswrapper[4765]: I0319 10:38:16.962175 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p26hp\" (UniqueName: \"kubernetes.io/projected/c3eb272d-2bcb-461c-b8fc-a569462389c1-kube-api-access-p26hp\") pod \"community-operators-89gkq\" (UID: \"c3eb272d-2bcb-461c-b8fc-a569462389c1\") " pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:16 crc kubenswrapper[4765]: I0319 10:38:16.962277 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3eb272d-2bcb-461c-b8fc-a569462389c1-catalog-content\") pod \"community-operators-89gkq\" (UID: \"c3eb272d-2bcb-461c-b8fc-a569462389c1\") " pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:17 crc kubenswrapper[4765]: I0319 10:38:17.063914 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3eb272d-2bcb-461c-b8fc-a569462389c1-catalog-content\") pod \"community-operators-89gkq\" (UID: \"c3eb272d-2bcb-461c-b8fc-a569462389c1\") " pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:17 crc kubenswrapper[4765]: I0319 10:38:17.064023 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3eb272d-2bcb-461c-b8fc-a569462389c1-utilities\") pod \"community-operators-89gkq\" (UID: \"c3eb272d-2bcb-461c-b8fc-a569462389c1\") " pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:17 crc kubenswrapper[4765]: I0319 10:38:17.064098 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p26hp\" (UniqueName: \"kubernetes.io/projected/c3eb272d-2bcb-461c-b8fc-a569462389c1-kube-api-access-p26hp\") pod \"community-operators-89gkq\" (UID: \"c3eb272d-2bcb-461c-b8fc-a569462389c1\") " pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:17 crc kubenswrapper[4765]: I0319 10:38:17.064419 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3eb272d-2bcb-461c-b8fc-a569462389c1-catalog-content\") pod \"community-operators-89gkq\" (UID: \"c3eb272d-2bcb-461c-b8fc-a569462389c1\") " pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:17 crc kubenswrapper[4765]: I0319 10:38:17.064451 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3eb272d-2bcb-461c-b8fc-a569462389c1-utilities\") pod \"community-operators-89gkq\" (UID: \"c3eb272d-2bcb-461c-b8fc-a569462389c1\") " pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:17 crc kubenswrapper[4765]: I0319 10:38:17.090637 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p26hp\" (UniqueName: \"kubernetes.io/projected/c3eb272d-2bcb-461c-b8fc-a569462389c1-kube-api-access-p26hp\") pod \"community-operators-89gkq\" (UID: \"c3eb272d-2bcb-461c-b8fc-a569462389c1\") " pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:17 crc kubenswrapper[4765]: I0319 10:38:17.137216 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:17 crc kubenswrapper[4765]: I0319 10:38:17.621713 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-89gkq"] Mar 19 10:38:17 crc kubenswrapper[4765]: W0319 10:38:17.631107 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3eb272d_2bcb_461c_b8fc_a569462389c1.slice/crio-3811de3faa5588f9d539911a6762c07f4ce87797c6f12b8e39bec70f1ce4e944 WatchSource:0}: Error finding container 3811de3faa5588f9d539911a6762c07f4ce87797c6f12b8e39bec70f1ce4e944: Status 404 returned error can't find the container with id 3811de3faa5588f9d539911a6762c07f4ce87797c6f12b8e39bec70f1ce4e944 Mar 19 10:38:18 crc kubenswrapper[4765]: I0319 10:38:18.313180 4765 generic.go:334] "Generic (PLEG): container finished" podID="c3eb272d-2bcb-461c-b8fc-a569462389c1" containerID="366dbb8313aab3c5dc9158a8902c21ed19c7aaf71b3922668a73186c12abea6a" exitCode=0 Mar 19 10:38:18 crc kubenswrapper[4765]: I0319 10:38:18.313365 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89gkq" event={"ID":"c3eb272d-2bcb-461c-b8fc-a569462389c1","Type":"ContainerDied","Data":"366dbb8313aab3c5dc9158a8902c21ed19c7aaf71b3922668a73186c12abea6a"} Mar 19 10:38:18 crc kubenswrapper[4765]: I0319 10:38:18.313496 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89gkq" event={"ID":"c3eb272d-2bcb-461c-b8fc-a569462389c1","Type":"ContainerStarted","Data":"3811de3faa5588f9d539911a6762c07f4ce87797c6f12b8e39bec70f1ce4e944"} Mar 19 10:38:19 crc kubenswrapper[4765]: I0319 10:38:19.321614 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89gkq" 
event={"ID":"c3eb272d-2bcb-461c-b8fc-a569462389c1","Type":"ContainerStarted","Data":"cdef827f2d2fb20d63dcccdeaa45e315300191cfe5f02d31822f5d94d9b049b8"} Mar 19 10:38:19 crc kubenswrapper[4765]: I0319 10:38:19.515424 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-fcqz8" Mar 19 10:38:19 crc kubenswrapper[4765]: I0319 10:38:19.515474 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-fcqz8" Mar 19 10:38:19 crc kubenswrapper[4765]: I0319 10:38:19.553759 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-fcqz8" Mar 19 10:38:20 crc kubenswrapper[4765]: I0319 10:38:20.331152 4765 generic.go:334] "Generic (PLEG): container finished" podID="c3eb272d-2bcb-461c-b8fc-a569462389c1" containerID="cdef827f2d2fb20d63dcccdeaa45e315300191cfe5f02d31822f5d94d9b049b8" exitCode=0 Mar 19 10:38:20 crc kubenswrapper[4765]: I0319 10:38:20.331276 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89gkq" event={"ID":"c3eb272d-2bcb-461c-b8fc-a569462389c1","Type":"ContainerDied","Data":"cdef827f2d2fb20d63dcccdeaa45e315300191cfe5f02d31822f5d94d9b049b8"} Mar 19 10:38:20 crc kubenswrapper[4765]: I0319 10:38:20.368410 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-fcqz8" Mar 19 10:38:21 crc kubenswrapper[4765]: I0319 10:38:21.341655 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89gkq" event={"ID":"c3eb272d-2bcb-461c-b8fc-a569462389c1","Type":"ContainerStarted","Data":"0b2da6318b7cc7c3f1c5eef12a28adbfaa850806456e23fd113b902c918d9157"} Mar 19 10:38:21 crc kubenswrapper[4765]: I0319 10:38:21.356299 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-w58kf" 
Mar 19 10:38:21 crc kubenswrapper[4765]: I0319 10:38:21.368802 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-89gkq" podStartSLOduration=2.963416788 podStartE2EDuration="5.368769767s" podCreationTimestamp="2026-03-19 10:38:16 +0000 UTC" firstStartedPulling="2026-03-19 10:38:18.319414752 +0000 UTC m=+996.668360324" lastFinishedPulling="2026-03-19 10:38:20.724767761 +0000 UTC m=+999.073713303" observedRunningTime="2026-03-19 10:38:21.36519265 +0000 UTC m=+999.714138232" watchObservedRunningTime="2026-03-19 10:38:21.368769767 +0000 UTC m=+999.717715349" Mar 19 10:38:26 crc kubenswrapper[4765]: I0319 10:38:26.179640 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-275fr"] Mar 19 10:38:26 crc kubenswrapper[4765]: I0319 10:38:26.183481 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-275fr" Mar 19 10:38:26 crc kubenswrapper[4765]: I0319 10:38:26.196637 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-275fr"] Mar 19 10:38:26 crc kubenswrapper[4765]: I0319 10:38:26.296682 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6zlgb\" (UniqueName: \"kubernetes.io/projected/b5bb9882-1699-4616-88b3-986070d06298-kube-api-access-6zlgb\") pod \"redhat-marketplace-275fr\" (UID: \"b5bb9882-1699-4616-88b3-986070d06298\") " pod="openshift-marketplace/redhat-marketplace-275fr" Mar 19 10:38:26 crc kubenswrapper[4765]: I0319 10:38:26.296848 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5bb9882-1699-4616-88b3-986070d06298-catalog-content\") pod \"redhat-marketplace-275fr\" (UID: \"b5bb9882-1699-4616-88b3-986070d06298\") " 
pod="openshift-marketplace/redhat-marketplace-275fr" Mar 19 10:38:26 crc kubenswrapper[4765]: I0319 10:38:26.296945 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5bb9882-1699-4616-88b3-986070d06298-utilities\") pod \"redhat-marketplace-275fr\" (UID: \"b5bb9882-1699-4616-88b3-986070d06298\") " pod="openshift-marketplace/redhat-marketplace-275fr" Mar 19 10:38:26 crc kubenswrapper[4765]: I0319 10:38:26.398785 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6zlgb\" (UniqueName: \"kubernetes.io/projected/b5bb9882-1699-4616-88b3-986070d06298-kube-api-access-6zlgb\") pod \"redhat-marketplace-275fr\" (UID: \"b5bb9882-1699-4616-88b3-986070d06298\") " pod="openshift-marketplace/redhat-marketplace-275fr" Mar 19 10:38:26 crc kubenswrapper[4765]: I0319 10:38:26.398898 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5bb9882-1699-4616-88b3-986070d06298-catalog-content\") pod \"redhat-marketplace-275fr\" (UID: \"b5bb9882-1699-4616-88b3-986070d06298\") " pod="openshift-marketplace/redhat-marketplace-275fr" Mar 19 10:38:26 crc kubenswrapper[4765]: I0319 10:38:26.398998 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5bb9882-1699-4616-88b3-986070d06298-utilities\") pod \"redhat-marketplace-275fr\" (UID: \"b5bb9882-1699-4616-88b3-986070d06298\") " pod="openshift-marketplace/redhat-marketplace-275fr" Mar 19 10:38:26 crc kubenswrapper[4765]: I0319 10:38:26.399670 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5bb9882-1699-4616-88b3-986070d06298-catalog-content\") pod \"redhat-marketplace-275fr\" (UID: \"b5bb9882-1699-4616-88b3-986070d06298\") " 
pod="openshift-marketplace/redhat-marketplace-275fr" Mar 19 10:38:26 crc kubenswrapper[4765]: I0319 10:38:26.399792 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5bb9882-1699-4616-88b3-986070d06298-utilities\") pod \"redhat-marketplace-275fr\" (UID: \"b5bb9882-1699-4616-88b3-986070d06298\") " pod="openshift-marketplace/redhat-marketplace-275fr" Mar 19 10:38:26 crc kubenswrapper[4765]: I0319 10:38:26.419992 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6zlgb\" (UniqueName: \"kubernetes.io/projected/b5bb9882-1699-4616-88b3-986070d06298-kube-api-access-6zlgb\") pod \"redhat-marketplace-275fr\" (UID: \"b5bb9882-1699-4616-88b3-986070d06298\") " pod="openshift-marketplace/redhat-marketplace-275fr" Mar 19 10:38:26 crc kubenswrapper[4765]: I0319 10:38:26.502403 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-275fr" Mar 19 10:38:26 crc kubenswrapper[4765]: I0319 10:38:26.773974 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-275fr"] Mar 19 10:38:26 crc kubenswrapper[4765]: W0319 10:38:26.779188 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5bb9882_1699_4616_88b3_986070d06298.slice/crio-edbba0208df11095870477bc00d08ae15271829c541bb877f2b5e306de21c838 WatchSource:0}: Error finding container edbba0208df11095870477bc00d08ae15271829c541bb877f2b5e306de21c838: Status 404 returned error can't find the container with id edbba0208df11095870477bc00d08ae15271829c541bb877f2b5e306de21c838 Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.138339 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.138396 4765 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.181829 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.383633 4765 generic.go:334] "Generic (PLEG): container finished" podID="b5bb9882-1699-4616-88b3-986070d06298" containerID="be98697021637e1660b583011ff0716a03df813a35ade1eaf35533366e32d658" exitCode=0 Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.383729 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-275fr" event={"ID":"b5bb9882-1699-4616-88b3-986070d06298","Type":"ContainerDied","Data":"be98697021637e1660b583011ff0716a03df813a35ade1eaf35533366e32d658"} Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.384232 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-275fr" event={"ID":"b5bb9882-1699-4616-88b3-986070d06298","Type":"ContainerStarted","Data":"edbba0208df11095870477bc00d08ae15271829c541bb877f2b5e306de21c838"} Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.438567 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.458005 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck"] Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.459636 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.461890 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-64bc8" Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.495365 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck"] Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.617644 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-bundle\") pod \"3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck\" (UID: \"b340ebf3-8897-41c3-8a3e-733e4afc3fdf\") " pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.617939 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-util\") pod \"3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck\" (UID: \"b340ebf3-8897-41c3-8a3e-733e4afc3fdf\") " pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.618053 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8jch\" (UniqueName: \"kubernetes.io/projected/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-kube-api-access-n8jch\") pod \"3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck\" (UID: \"b340ebf3-8897-41c3-8a3e-733e4afc3fdf\") " pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 
10:38:27.719375 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-bundle\") pod \"3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck\" (UID: \"b340ebf3-8897-41c3-8a3e-733e4afc3fdf\") " pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.719827 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-util\") pod \"3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck\" (UID: \"b340ebf3-8897-41c3-8a3e-733e4afc3fdf\") " pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.719855 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n8jch\" (UniqueName: \"kubernetes.io/projected/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-kube-api-access-n8jch\") pod \"3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck\" (UID: \"b340ebf3-8897-41c3-8a3e-733e4afc3fdf\") " pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.720233 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-bundle\") pod \"3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck\" (UID: \"b340ebf3-8897-41c3-8a3e-733e4afc3fdf\") " pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.720303 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-util\") pod \"3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck\" (UID: \"b340ebf3-8897-41c3-8a3e-733e4afc3fdf\") " pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.748951 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8jch\" (UniqueName: \"kubernetes.io/projected/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-kube-api-access-n8jch\") pod \"3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck\" (UID: \"b340ebf3-8897-41c3-8a3e-733e4afc3fdf\") " pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" Mar 19 10:38:27 crc kubenswrapper[4765]: I0319 10:38:27.778071 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" Mar 19 10:38:28 crc kubenswrapper[4765]: I0319 10:38:28.190036 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck"] Mar 19 10:38:28 crc kubenswrapper[4765]: I0319 10:38:28.392012 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" event={"ID":"b340ebf3-8897-41c3-8a3e-733e4afc3fdf","Type":"ContainerStarted","Data":"6abd5fba5b3eb8fd8e6b3aa1351403474b8faf08eb903061da0487f03798c4b7"} Mar 19 10:38:28 crc kubenswrapper[4765]: I0319 10:38:28.392075 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" event={"ID":"b340ebf3-8897-41c3-8a3e-733e4afc3fdf","Type":"ContainerStarted","Data":"99d541dd72f2d6597907f7838942fe8b1ac862a13229f262434d156b44651f51"} Mar 19 10:38:29 crc kubenswrapper[4765]: I0319 10:38:29.405089 4765 generic.go:334] 
"Generic (PLEG): container finished" podID="b5bb9882-1699-4616-88b3-986070d06298" containerID="4c9031c95738c9c9340fa57401e2608b02ef11d128345068394699ced83bf4af" exitCode=0 Mar 19 10:38:29 crc kubenswrapper[4765]: I0319 10:38:29.405180 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-275fr" event={"ID":"b5bb9882-1699-4616-88b3-986070d06298","Type":"ContainerDied","Data":"4c9031c95738c9c9340fa57401e2608b02ef11d128345068394699ced83bf4af"} Mar 19 10:38:29 crc kubenswrapper[4765]: I0319 10:38:29.407490 4765 generic.go:334] "Generic (PLEG): container finished" podID="b340ebf3-8897-41c3-8a3e-733e4afc3fdf" containerID="6abd5fba5b3eb8fd8e6b3aa1351403474b8faf08eb903061da0487f03798c4b7" exitCode=0 Mar 19 10:38:29 crc kubenswrapper[4765]: I0319 10:38:29.407537 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" event={"ID":"b340ebf3-8897-41c3-8a3e-733e4afc3fdf","Type":"ContainerDied","Data":"6abd5fba5b3eb8fd8e6b3aa1351403474b8faf08eb903061da0487f03798c4b7"} Mar 19 10:38:30 crc kubenswrapper[4765]: I0319 10:38:30.422905 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-275fr" event={"ID":"b5bb9882-1699-4616-88b3-986070d06298","Type":"ContainerStarted","Data":"1493717ead4c1435c17d84ca8e9aabfd9b4b727d35b9ca5186fb63c2fb2dd2ea"} Mar 19 10:38:30 crc kubenswrapper[4765]: I0319 10:38:30.426020 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" event={"ID":"b340ebf3-8897-41c3-8a3e-733e4afc3fdf","Type":"ContainerStarted","Data":"3eef6d919e721b891968d89f59211f9691c43ba8b03fe6d3d15965bcbe98b4a0"} Mar 19 10:38:30 crc kubenswrapper[4765]: I0319 10:38:30.455204 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-275fr" 
podStartSLOduration=2.003606567 podStartE2EDuration="4.455184161s" podCreationTimestamp="2026-03-19 10:38:26 +0000 UTC" firstStartedPulling="2026-03-19 10:38:27.385645316 +0000 UTC m=+1005.734590858" lastFinishedPulling="2026-03-19 10:38:29.83722291 +0000 UTC m=+1008.186168452" observedRunningTime="2026-03-19 10:38:30.446267469 +0000 UTC m=+1008.795213011" watchObservedRunningTime="2026-03-19 10:38:30.455184161 +0000 UTC m=+1008.804129703" Mar 19 10:38:31 crc kubenswrapper[4765]: I0319 10:38:31.434653 4765 generic.go:334] "Generic (PLEG): container finished" podID="b340ebf3-8897-41c3-8a3e-733e4afc3fdf" containerID="3eef6d919e721b891968d89f59211f9691c43ba8b03fe6d3d15965bcbe98b4a0" exitCode=0 Mar 19 10:38:31 crc kubenswrapper[4765]: I0319 10:38:31.435765 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" event={"ID":"b340ebf3-8897-41c3-8a3e-733e4afc3fdf","Type":"ContainerDied","Data":"3eef6d919e721b891968d89f59211f9691c43ba8b03fe6d3d15965bcbe98b4a0"} Mar 19 10:38:31 crc kubenswrapper[4765]: I0319 10:38:31.974594 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-89gkq"] Mar 19 10:38:31 crc kubenswrapper[4765]: I0319 10:38:31.975358 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-89gkq" podUID="c3eb272d-2bcb-461c-b8fc-a569462389c1" containerName="registry-server" containerID="cri-o://0b2da6318b7cc7c3f1c5eef12a28adbfaa850806456e23fd113b902c918d9157" gracePeriod=2 Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.367618 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.443606 4765 generic.go:334] "Generic (PLEG): container finished" podID="c3eb272d-2bcb-461c-b8fc-a569462389c1" containerID="0b2da6318b7cc7c3f1c5eef12a28adbfaa850806456e23fd113b902c918d9157" exitCode=0 Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.443672 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-89gkq" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.443695 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89gkq" event={"ID":"c3eb272d-2bcb-461c-b8fc-a569462389c1","Type":"ContainerDied","Data":"0b2da6318b7cc7c3f1c5eef12a28adbfaa850806456e23fd113b902c918d9157"} Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.443728 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-89gkq" event={"ID":"c3eb272d-2bcb-461c-b8fc-a569462389c1","Type":"ContainerDied","Data":"3811de3faa5588f9d539911a6762c07f4ce87797c6f12b8e39bec70f1ce4e944"} Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.443747 4765 scope.go:117] "RemoveContainer" containerID="0b2da6318b7cc7c3f1c5eef12a28adbfaa850806456e23fd113b902c918d9157" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.451789 4765 generic.go:334] "Generic (PLEG): container finished" podID="b340ebf3-8897-41c3-8a3e-733e4afc3fdf" containerID="29ca8763e95abfa97f4eccaa3c7aa7502eb28383b5adfa429eec34481365a82a" exitCode=0 Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.451854 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" event={"ID":"b340ebf3-8897-41c3-8a3e-733e4afc3fdf","Type":"ContainerDied","Data":"29ca8763e95abfa97f4eccaa3c7aa7502eb28383b5adfa429eec34481365a82a"} Mar 19 10:38:32 crc 
kubenswrapper[4765]: I0319 10:38:32.464438 4765 scope.go:117] "RemoveContainer" containerID="cdef827f2d2fb20d63dcccdeaa45e315300191cfe5f02d31822f5d94d9b049b8" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.488086 4765 scope.go:117] "RemoveContainer" containerID="366dbb8313aab3c5dc9158a8902c21ed19c7aaf71b3922668a73186c12abea6a" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.494418 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p26hp\" (UniqueName: \"kubernetes.io/projected/c3eb272d-2bcb-461c-b8fc-a569462389c1-kube-api-access-p26hp\") pod \"c3eb272d-2bcb-461c-b8fc-a569462389c1\" (UID: \"c3eb272d-2bcb-461c-b8fc-a569462389c1\") " Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.494520 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3eb272d-2bcb-461c-b8fc-a569462389c1-utilities\") pod \"c3eb272d-2bcb-461c-b8fc-a569462389c1\" (UID: \"c3eb272d-2bcb-461c-b8fc-a569462389c1\") " Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.494705 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3eb272d-2bcb-461c-b8fc-a569462389c1-catalog-content\") pod \"c3eb272d-2bcb-461c-b8fc-a569462389c1\" (UID: \"c3eb272d-2bcb-461c-b8fc-a569462389c1\") " Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.495264 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3eb272d-2bcb-461c-b8fc-a569462389c1-utilities" (OuterVolumeSpecName: "utilities") pod "c3eb272d-2bcb-461c-b8fc-a569462389c1" (UID: "c3eb272d-2bcb-461c-b8fc-a569462389c1"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.513713 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3eb272d-2bcb-461c-b8fc-a569462389c1-kube-api-access-p26hp" (OuterVolumeSpecName: "kube-api-access-p26hp") pod "c3eb272d-2bcb-461c-b8fc-a569462389c1" (UID: "c3eb272d-2bcb-461c-b8fc-a569462389c1"). InnerVolumeSpecName "kube-api-access-p26hp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.513863 4765 scope.go:117] "RemoveContainer" containerID="0b2da6318b7cc7c3f1c5eef12a28adbfaa850806456e23fd113b902c918d9157" Mar 19 10:38:32 crc kubenswrapper[4765]: E0319 10:38:32.514424 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b2da6318b7cc7c3f1c5eef12a28adbfaa850806456e23fd113b902c918d9157\": container with ID starting with 0b2da6318b7cc7c3f1c5eef12a28adbfaa850806456e23fd113b902c918d9157 not found: ID does not exist" containerID="0b2da6318b7cc7c3f1c5eef12a28adbfaa850806456e23fd113b902c918d9157" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.514478 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b2da6318b7cc7c3f1c5eef12a28adbfaa850806456e23fd113b902c918d9157"} err="failed to get container status \"0b2da6318b7cc7c3f1c5eef12a28adbfaa850806456e23fd113b902c918d9157\": rpc error: code = NotFound desc = could not find container \"0b2da6318b7cc7c3f1c5eef12a28adbfaa850806456e23fd113b902c918d9157\": container with ID starting with 0b2da6318b7cc7c3f1c5eef12a28adbfaa850806456e23fd113b902c918d9157 not found: ID does not exist" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.514512 4765 scope.go:117] "RemoveContainer" containerID="cdef827f2d2fb20d63dcccdeaa45e315300191cfe5f02d31822f5d94d9b049b8" Mar 19 10:38:32 crc kubenswrapper[4765]: E0319 10:38:32.514886 
4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdef827f2d2fb20d63dcccdeaa45e315300191cfe5f02d31822f5d94d9b049b8\": container with ID starting with cdef827f2d2fb20d63dcccdeaa45e315300191cfe5f02d31822f5d94d9b049b8 not found: ID does not exist" containerID="cdef827f2d2fb20d63dcccdeaa45e315300191cfe5f02d31822f5d94d9b049b8" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.514921 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdef827f2d2fb20d63dcccdeaa45e315300191cfe5f02d31822f5d94d9b049b8"} err="failed to get container status \"cdef827f2d2fb20d63dcccdeaa45e315300191cfe5f02d31822f5d94d9b049b8\": rpc error: code = NotFound desc = could not find container \"cdef827f2d2fb20d63dcccdeaa45e315300191cfe5f02d31822f5d94d9b049b8\": container with ID starting with cdef827f2d2fb20d63dcccdeaa45e315300191cfe5f02d31822f5d94d9b049b8 not found: ID does not exist" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.514952 4765 scope.go:117] "RemoveContainer" containerID="366dbb8313aab3c5dc9158a8902c21ed19c7aaf71b3922668a73186c12abea6a" Mar 19 10:38:32 crc kubenswrapper[4765]: E0319 10:38:32.516213 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"366dbb8313aab3c5dc9158a8902c21ed19c7aaf71b3922668a73186c12abea6a\": container with ID starting with 366dbb8313aab3c5dc9158a8902c21ed19c7aaf71b3922668a73186c12abea6a not found: ID does not exist" containerID="366dbb8313aab3c5dc9158a8902c21ed19c7aaf71b3922668a73186c12abea6a" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.516246 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"366dbb8313aab3c5dc9158a8902c21ed19c7aaf71b3922668a73186c12abea6a"} err="failed to get container status \"366dbb8313aab3c5dc9158a8902c21ed19c7aaf71b3922668a73186c12abea6a\": rpc error: code = 
NotFound desc = could not find container \"366dbb8313aab3c5dc9158a8902c21ed19c7aaf71b3922668a73186c12abea6a\": container with ID starting with 366dbb8313aab3c5dc9158a8902c21ed19c7aaf71b3922668a73186c12abea6a not found: ID does not exist" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.574310 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3eb272d-2bcb-461c-b8fc-a569462389c1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c3eb272d-2bcb-461c-b8fc-a569462389c1" (UID: "c3eb272d-2bcb-461c-b8fc-a569462389c1"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.597147 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c3eb272d-2bcb-461c-b8fc-a569462389c1-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.597180 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p26hp\" (UniqueName: \"kubernetes.io/projected/c3eb272d-2bcb-461c-b8fc-a569462389c1-kube-api-access-p26hp\") on node \"crc\" DevicePath \"\"" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.597192 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c3eb272d-2bcb-461c-b8fc-a569462389c1-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.775142 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-89gkq"] Mar 19 10:38:32 crc kubenswrapper[4765]: I0319 10:38:32.780946 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-89gkq"] Mar 19 10:38:33 crc kubenswrapper[4765]: I0319 10:38:33.772816 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" Mar 19 10:38:33 crc kubenswrapper[4765]: I0319 10:38:33.920438 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-util\") pod \"b340ebf3-8897-41c3-8a3e-733e4afc3fdf\" (UID: \"b340ebf3-8897-41c3-8a3e-733e4afc3fdf\") " Mar 19 10:38:33 crc kubenswrapper[4765]: I0319 10:38:33.920839 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n8jch\" (UniqueName: \"kubernetes.io/projected/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-kube-api-access-n8jch\") pod \"b340ebf3-8897-41c3-8a3e-733e4afc3fdf\" (UID: \"b340ebf3-8897-41c3-8a3e-733e4afc3fdf\") " Mar 19 10:38:33 crc kubenswrapper[4765]: I0319 10:38:33.920948 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-bundle\") pod \"b340ebf3-8897-41c3-8a3e-733e4afc3fdf\" (UID: \"b340ebf3-8897-41c3-8a3e-733e4afc3fdf\") " Mar 19 10:38:33 crc kubenswrapper[4765]: I0319 10:38:33.922005 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-bundle" (OuterVolumeSpecName: "bundle") pod "b340ebf3-8897-41c3-8a3e-733e4afc3fdf" (UID: "b340ebf3-8897-41c3-8a3e-733e4afc3fdf"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:38:33 crc kubenswrapper[4765]: I0319 10:38:33.928688 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-kube-api-access-n8jch" (OuterVolumeSpecName: "kube-api-access-n8jch") pod "b340ebf3-8897-41c3-8a3e-733e4afc3fdf" (UID: "b340ebf3-8897-41c3-8a3e-733e4afc3fdf"). InnerVolumeSpecName "kube-api-access-n8jch". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:38:34 crc kubenswrapper[4765]: I0319 10:38:34.023080 4765 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:38:34 crc kubenswrapper[4765]: I0319 10:38:34.023140 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n8jch\" (UniqueName: \"kubernetes.io/projected/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-kube-api-access-n8jch\") on node \"crc\" DevicePath \"\"" Mar 19 10:38:34 crc kubenswrapper[4765]: I0319 10:38:34.363919 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3eb272d-2bcb-461c-b8fc-a569462389c1" path="/var/lib/kubelet/pods/c3eb272d-2bcb-461c-b8fc-a569462389c1/volumes" Mar 19 10:38:34 crc kubenswrapper[4765]: I0319 10:38:34.470195 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" event={"ID":"b340ebf3-8897-41c3-8a3e-733e4afc3fdf","Type":"ContainerDied","Data":"99d541dd72f2d6597907f7838942fe8b1ac862a13229f262434d156b44651f51"} Mar 19 10:38:34 crc kubenswrapper[4765]: I0319 10:38:34.470564 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="99d541dd72f2d6597907f7838942fe8b1ac862a13229f262434d156b44651f51" Mar 19 10:38:34 crc kubenswrapper[4765]: I0319 10:38:34.470312 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck" Mar 19 10:38:34 crc kubenswrapper[4765]: I0319 10:38:34.684115 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-util" (OuterVolumeSpecName: "util") pod "b340ebf3-8897-41c3-8a3e-733e4afc3fdf" (UID: "b340ebf3-8897-41c3-8a3e-733e4afc3fdf"). 
InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:38:34 crc kubenswrapper[4765]: I0319 10:38:34.734110 4765 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/b340ebf3-8897-41c3-8a3e-733e4afc3fdf-util\") on node \"crc\" DevicePath \"\"" Mar 19 10:38:36 crc kubenswrapper[4765]: I0319 10:38:36.502975 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-275fr" Mar 19 10:38:36 crc kubenswrapper[4765]: I0319 10:38:36.503514 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-275fr" Mar 19 10:38:36 crc kubenswrapper[4765]: I0319 10:38:36.562366 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-275fr" Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.532138 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-275fr" Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.639071 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-76ccd786f6-h42th"] Mar 19 10:38:37 crc kubenswrapper[4765]: E0319 10:38:37.639610 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b340ebf3-8897-41c3-8a3e-733e4afc3fdf" containerName="pull" Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.639684 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b340ebf3-8897-41c3-8a3e-733e4afc3fdf" containerName="pull" Mar 19 10:38:37 crc kubenswrapper[4765]: E0319 10:38:37.639770 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3eb272d-2bcb-461c-b8fc-a569462389c1" containerName="extract-content" Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.639828 4765 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="c3eb272d-2bcb-461c-b8fc-a569462389c1" containerName="extract-content" Mar 19 10:38:37 crc kubenswrapper[4765]: E0319 10:38:37.639885 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3eb272d-2bcb-461c-b8fc-a569462389c1" containerName="registry-server" Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.639943 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3eb272d-2bcb-461c-b8fc-a569462389c1" containerName="registry-server" Mar 19 10:38:37 crc kubenswrapper[4765]: E0319 10:38:37.640039 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b340ebf3-8897-41c3-8a3e-733e4afc3fdf" containerName="extract" Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.640095 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b340ebf3-8897-41c3-8a3e-733e4afc3fdf" containerName="extract" Mar 19 10:38:37 crc kubenswrapper[4765]: E0319 10:38:37.640151 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b340ebf3-8897-41c3-8a3e-733e4afc3fdf" containerName="util" Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.640207 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b340ebf3-8897-41c3-8a3e-733e4afc3fdf" containerName="util" Mar 19 10:38:37 crc kubenswrapper[4765]: E0319 10:38:37.640279 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3eb272d-2bcb-461c-b8fc-a569462389c1" containerName="extract-utilities" Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.640372 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3eb272d-2bcb-461c-b8fc-a569462389c1" containerName="extract-utilities" Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.640546 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3eb272d-2bcb-461c-b8fc-a569462389c1" containerName="registry-server" Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.640618 4765 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="b340ebf3-8897-41c3-8a3e-733e4afc3fdf" containerName="extract" Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.641117 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-76ccd786f6-h42th" Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.645837 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-k8pxx" Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.684492 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-76ccd786f6-h42th"] Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.780168 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btps2\" (UniqueName: \"kubernetes.io/projected/538ce45d-9424-41e4-8d9e-ff63db0df6be-kube-api-access-btps2\") pod \"openstack-operator-controller-init-76ccd786f6-h42th\" (UID: \"538ce45d-9424-41e4-8d9e-ff63db0df6be\") " pod="openstack-operators/openstack-operator-controller-init-76ccd786f6-h42th" Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.881625 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btps2\" (UniqueName: \"kubernetes.io/projected/538ce45d-9424-41e4-8d9e-ff63db0df6be-kube-api-access-btps2\") pod \"openstack-operator-controller-init-76ccd786f6-h42th\" (UID: \"538ce45d-9424-41e4-8d9e-ff63db0df6be\") " pod="openstack-operators/openstack-operator-controller-init-76ccd786f6-h42th" Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.916018 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btps2\" (UniqueName: \"kubernetes.io/projected/538ce45d-9424-41e4-8d9e-ff63db0df6be-kube-api-access-btps2\") pod \"openstack-operator-controller-init-76ccd786f6-h42th\" (UID: \"538ce45d-9424-41e4-8d9e-ff63db0df6be\") " 
pod="openstack-operators/openstack-operator-controller-init-76ccd786f6-h42th"
Mar 19 10:38:37 crc kubenswrapper[4765]: I0319 10:38:37.962894 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-76ccd786f6-h42th"
Mar 19 10:38:38 crc kubenswrapper[4765]: I0319 10:38:38.226364 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-76ccd786f6-h42th"]
Mar 19 10:38:38 crc kubenswrapper[4765]: I0319 10:38:38.501454 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-76ccd786f6-h42th" event={"ID":"538ce45d-9424-41e4-8d9e-ff63db0df6be","Type":"ContainerStarted","Data":"68d92fcad0e27f7f326e09af2918394c55837c91671fd9644a0ee09776c79353"}
Mar 19 10:38:38 crc kubenswrapper[4765]: I0319 10:38:38.978099 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-275fr"]
Mar 19 10:38:39 crc kubenswrapper[4765]: I0319 10:38:39.507373 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-275fr" podUID="b5bb9882-1699-4616-88b3-986070d06298" containerName="registry-server" containerID="cri-o://1493717ead4c1435c17d84ca8e9aabfd9b4b727d35b9ca5186fb63c2fb2dd2ea" gracePeriod=2
Mar 19 10:38:40 crc kubenswrapper[4765]: I0319 10:38:40.519004 4765 generic.go:334] "Generic (PLEG): container finished" podID="b5bb9882-1699-4616-88b3-986070d06298" containerID="1493717ead4c1435c17d84ca8e9aabfd9b4b727d35b9ca5186fb63c2fb2dd2ea" exitCode=0
Mar 19 10:38:40 crc kubenswrapper[4765]: I0319 10:38:40.519063 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-275fr" event={"ID":"b5bb9882-1699-4616-88b3-986070d06298","Type":"ContainerDied","Data":"1493717ead4c1435c17d84ca8e9aabfd9b4b727d35b9ca5186fb63c2fb2dd2ea"}
Mar 19 10:38:42 crc kubenswrapper[4765]: I0319 10:38:42.262350 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-275fr"
Mar 19 10:38:42 crc kubenswrapper[4765]: I0319 10:38:42.375975 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5bb9882-1699-4616-88b3-986070d06298-utilities\") pod \"b5bb9882-1699-4616-88b3-986070d06298\" (UID: \"b5bb9882-1699-4616-88b3-986070d06298\") "
Mar 19 10:38:42 crc kubenswrapper[4765]: I0319 10:38:42.376092 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6zlgb\" (UniqueName: \"kubernetes.io/projected/b5bb9882-1699-4616-88b3-986070d06298-kube-api-access-6zlgb\") pod \"b5bb9882-1699-4616-88b3-986070d06298\" (UID: \"b5bb9882-1699-4616-88b3-986070d06298\") "
Mar 19 10:38:42 crc kubenswrapper[4765]: I0319 10:38:42.376173 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5bb9882-1699-4616-88b3-986070d06298-catalog-content\") pod \"b5bb9882-1699-4616-88b3-986070d06298\" (UID: \"b5bb9882-1699-4616-88b3-986070d06298\") "
Mar 19 10:38:42 crc kubenswrapper[4765]: I0319 10:38:42.377329 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5bb9882-1699-4616-88b3-986070d06298-utilities" (OuterVolumeSpecName: "utilities") pod "b5bb9882-1699-4616-88b3-986070d06298" (UID: "b5bb9882-1699-4616-88b3-986070d06298"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:38:42 crc kubenswrapper[4765]: I0319 10:38:42.384192 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5bb9882-1699-4616-88b3-986070d06298-kube-api-access-6zlgb" (OuterVolumeSpecName: "kube-api-access-6zlgb") pod "b5bb9882-1699-4616-88b3-986070d06298" (UID: "b5bb9882-1699-4616-88b3-986070d06298"). InnerVolumeSpecName "kube-api-access-6zlgb". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:38:42 crc kubenswrapper[4765]: I0319 10:38:42.422731 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5bb9882-1699-4616-88b3-986070d06298-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b5bb9882-1699-4616-88b3-986070d06298" (UID: "b5bb9882-1699-4616-88b3-986070d06298"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:38:42 crc kubenswrapper[4765]: I0319 10:38:42.478007 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b5bb9882-1699-4616-88b3-986070d06298-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 10:38:42 crc kubenswrapper[4765]: I0319 10:38:42.478049 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b5bb9882-1699-4616-88b3-986070d06298-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 10:38:42 crc kubenswrapper[4765]: I0319 10:38:42.478068 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6zlgb\" (UniqueName: \"kubernetes.io/projected/b5bb9882-1699-4616-88b3-986070d06298-kube-api-access-6zlgb\") on node \"crc\" DevicePath \"\""
Mar 19 10:38:42 crc kubenswrapper[4765]: I0319 10:38:42.539448 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-275fr" event={"ID":"b5bb9882-1699-4616-88b3-986070d06298","Type":"ContainerDied","Data":"edbba0208df11095870477bc00d08ae15271829c541bb877f2b5e306de21c838"}
Mar 19 10:38:42 crc kubenswrapper[4765]: I0319 10:38:42.539516 4765 scope.go:117] "RemoveContainer" containerID="1493717ead4c1435c17d84ca8e9aabfd9b4b727d35b9ca5186fb63c2fb2dd2ea"
Mar 19 10:38:42 crc kubenswrapper[4765]: I0319 10:38:42.539541 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-275fr"
Mar 19 10:38:42 crc kubenswrapper[4765]: I0319 10:38:42.573162 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-275fr"]
Mar 19 10:38:42 crc kubenswrapper[4765]: I0319 10:38:42.582992 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-275fr"]
Mar 19 10:38:43 crc kubenswrapper[4765]: I0319 10:38:43.020095 4765 scope.go:117] "RemoveContainer" containerID="96a65f207ad8e02366d3481f11cd82724013ad6566a214aacb391606ba3a7559"
Mar 19 10:38:43 crc kubenswrapper[4765]: I0319 10:38:43.083941 4765 scope.go:117] "RemoveContainer" containerID="4c9031c95738c9c9340fa57401e2608b02ef11d128345068394699ced83bf4af"
Mar 19 10:38:43 crc kubenswrapper[4765]: I0319 10:38:43.132458 4765 scope.go:117] "RemoveContainer" containerID="be98697021637e1660b583011ff0716a03df813a35ade1eaf35533366e32d658"
Mar 19 10:38:43 crc kubenswrapper[4765]: I0319 10:38:43.547783 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-76ccd786f6-h42th" event={"ID":"538ce45d-9424-41e4-8d9e-ff63db0df6be","Type":"ContainerStarted","Data":"d20d89a505c6580460f9e8872af7f3ea814ad00c6199711c5ae68a350be489fa"}
Mar 19 10:38:43 crc kubenswrapper[4765]: I0319 10:38:43.549237 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-76ccd786f6-h42th"
Mar 19 10:38:44 crc kubenswrapper[4765]: I0319 10:38:44.364648 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5bb9882-1699-4616-88b3-986070d06298" path="/var/lib/kubelet/pods/b5bb9882-1699-4616-88b3-986070d06298/volumes"
Mar 19 10:38:57 crc kubenswrapper[4765]: I0319 10:38:57.974705 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-76ccd786f6-h42th"
Mar 19 10:38:58 crc kubenswrapper[4765]: I0319 10:38:58.007536 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-76ccd786f6-h42th" podStartSLOduration=16.07958554 podStartE2EDuration="21.007511084s" podCreationTimestamp="2026-03-19 10:38:37 +0000 UTC" firstStartedPulling="2026-03-19 10:38:38.242868039 +0000 UTC m=+1016.591813591" lastFinishedPulling="2026-03-19 10:38:43.170793583 +0000 UTC m=+1021.519739135" observedRunningTime="2026-03-19 10:38:43.577670119 +0000 UTC m=+1021.926615681" watchObservedRunningTime="2026-03-19 10:38:58.007511084 +0000 UTC m=+1036.356456646"
Mar 19 10:39:01 crc kubenswrapper[4765]: I0319 10:39:01.655876 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 10:39:01 crc kubenswrapper[4765]: I0319 10:39:01.656407 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.726700 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-kvrz2"]
Mar 19 10:39:27 crc kubenswrapper[4765]: E0319 10:39:27.728869 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5bb9882-1699-4616-88b3-986070d06298" containerName="registry-server"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.728952 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5bb9882-1699-4616-88b3-986070d06298" containerName="registry-server"
Mar 19 10:39:27 crc kubenswrapper[4765]: E0319 10:39:27.729054 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5bb9882-1699-4616-88b3-986070d06298" containerName="extract-utilities"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.729113 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5bb9882-1699-4616-88b3-986070d06298" containerName="extract-utilities"
Mar 19 10:39:27 crc kubenswrapper[4765]: E0319 10:39:27.729178 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b5bb9882-1699-4616-88b3-986070d06298" containerName="extract-content"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.729233 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5bb9882-1699-4616-88b3-986070d06298" containerName="extract-content"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.729420 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="b5bb9882-1699-4616-88b3-986070d06298" containerName="registry-server"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.729978 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kvrz2"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.732481 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-4jvtt"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.742157 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-kvrz2"]
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.756373 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-ncc44"]
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.757444 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-ncc44"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.765650 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-w6hkf"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.776208 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-6h4tg"]
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.777183 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6h4tg"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.780595 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-mjptg"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.820339 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-ncc44"]
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.838721 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lt2vw\" (UniqueName: \"kubernetes.io/projected/1954f819-78c2-46fd-a6bf-c626d50ef527-kube-api-access-lt2vw\") pod \"barbican-operator-controller-manager-59bc569d95-kvrz2\" (UID: \"1954f819-78c2-46fd-a6bf-c626d50ef527\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kvrz2"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.838782 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gsr8d\" (UniqueName: \"kubernetes.io/projected/1d54708d-8829-411d-a632-ce3b53b7aeaa-kube-api-access-gsr8d\") pod \"cinder-operator-controller-manager-8d58dc466-ncc44\" (UID: \"1d54708d-8829-411d-a632-ce3b53b7aeaa\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-ncc44"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.848608 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-4zf8v"]
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.849729 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-4zf8v"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.854281 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-7s92z"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.861511 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-6h4tg"]
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.871025 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-4zf8v"]
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.890051 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-qrqf8"]
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.891217 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-qrqf8"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.893845 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-fbtkt"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.900233 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-qrqf8"]
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.905179 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-kzvs8"]
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.906414 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-kzvs8"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.909146 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-gv8lk"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.921247 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-kzvs8"]
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.929165 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-p27jn"]
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.930417 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-p27jn"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.933773 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-7qq4v"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.940081 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dgmnj\" (UniqueName: \"kubernetes.io/projected/c5e9a9c2-b7c8-4e1a-8c60-e6fcb0616e01-kube-api-access-dgmnj\") pod \"designate-operator-controller-manager-588d4d986b-6h4tg\" (UID: \"c5e9a9c2-b7c8-4e1a-8c60-e6fcb0616e01\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6h4tg"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.940175 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qp8bs\" (UniqueName: \"kubernetes.io/projected/123b9f81-d315-44b3-a6ec-d777cc18ab7b-kube-api-access-qp8bs\") pod \"heat-operator-controller-manager-67dd5f86f5-4zf8v\" (UID: \"123b9f81-d315-44b3-a6ec-d777cc18ab7b\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-4zf8v"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.940238 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lt2vw\" (UniqueName: \"kubernetes.io/projected/1954f819-78c2-46fd-a6bf-c626d50ef527-kube-api-access-lt2vw\") pod \"barbican-operator-controller-manager-59bc569d95-kvrz2\" (UID: \"1954f819-78c2-46fd-a6bf-c626d50ef527\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kvrz2"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.940311 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gsr8d\" (UniqueName: \"kubernetes.io/projected/1d54708d-8829-411d-a632-ce3b53b7aeaa-kube-api-access-gsr8d\") pod \"cinder-operator-controller-manager-8d58dc466-ncc44\" (UID: \"1d54708d-8829-411d-a632-ce3b53b7aeaa\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-ncc44"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.977155 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht"]
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.978053 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.983377 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.984259 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gsr8d\" (UniqueName: \"kubernetes.io/projected/1d54708d-8829-411d-a632-ce3b53b7aeaa-kube-api-access-gsr8d\") pod \"cinder-operator-controller-manager-8d58dc466-ncc44\" (UID: \"1d54708d-8829-411d-a632-ce3b53b7aeaa\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-ncc44"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.996207 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-nmxkk"
Mar 19 10:39:27 crc kubenswrapper[4765]: I0319 10:39:27.996239 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lt2vw\" (UniqueName: \"kubernetes.io/projected/1954f819-78c2-46fd-a6bf-c626d50ef527-kube-api-access-lt2vw\") pod \"barbican-operator-controller-manager-59bc569d95-kvrz2\" (UID: \"1954f819-78c2-46fd-a6bf-c626d50ef527\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kvrz2"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.035796 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-q9lpg"]
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.037926 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-q9lpg"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.046020 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dgmnj\" (UniqueName: \"kubernetes.io/projected/c5e9a9c2-b7c8-4e1a-8c60-e6fcb0616e01-kube-api-access-dgmnj\") pod \"designate-operator-controller-manager-588d4d986b-6h4tg\" (UID: \"c5e9a9c2-b7c8-4e1a-8c60-e6fcb0616e01\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6h4tg"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.046084 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsr2d\" (UniqueName: \"kubernetes.io/projected/3e1ee5ea-abd4-4a73-840e-43fbd3732cfd-kube-api-access-zsr2d\") pod \"glance-operator-controller-manager-79df6bcc97-kzvs8\" (UID: \"3e1ee5ea-abd4-4a73-840e-43fbd3732cfd\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-kzvs8"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.046119 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qn9r\" (UniqueName: \"kubernetes.io/projected/047a8026-b206-4eb6-9630-3b550af68d3a-kube-api-access-7qn9r\") pod \"horizon-operator-controller-manager-8464cc45fb-qrqf8\" (UID: \"047a8026-b206-4eb6-9630-3b550af68d3a\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-qrqf8"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.046141 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mpq7\" (UniqueName: \"kubernetes.io/projected/cdeba207-ced7-4575-9c08-c001d85b0a93-kube-api-access-9mpq7\") pod \"ironic-operator-controller-manager-6f787dddc9-p27jn\" (UID: \"cdeba207-ced7-4575-9c08-c001d85b0a93\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-p27jn"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.046169 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qp8bs\" (UniqueName: \"kubernetes.io/projected/123b9f81-d315-44b3-a6ec-d777cc18ab7b-kube-api-access-qp8bs\") pod \"heat-operator-controller-manager-67dd5f86f5-4zf8v\" (UID: \"123b9f81-d315-44b3-a6ec-d777cc18ab7b\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-4zf8v"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.046929 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-26kdn"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.047225 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kvrz2"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.074718 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qp8bs\" (UniqueName: \"kubernetes.io/projected/123b9f81-d315-44b3-a6ec-d777cc18ab7b-kube-api-access-qp8bs\") pod \"heat-operator-controller-manager-67dd5f86f5-4zf8v\" (UID: \"123b9f81-d315-44b3-a6ec-d777cc18ab7b\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-4zf8v"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.074898 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht"]
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.079718 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-ncc44"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.087421 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dgmnj\" (UniqueName: \"kubernetes.io/projected/c5e9a9c2-b7c8-4e1a-8c60-e6fcb0616e01-kube-api-access-dgmnj\") pod \"designate-operator-controller-manager-588d4d986b-6h4tg\" (UID: \"c5e9a9c2-b7c8-4e1a-8c60-e6fcb0616e01\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6h4tg"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.099709 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6h4tg"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.105776 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-p27jn"]
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.127142 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-q9lpg"]
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.137009 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bg8b9"]
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.137843 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bg8b9"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.140117 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-5vt8p"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.147902 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert\") pod \"infra-operator-controller-manager-7b9c774f96-2gnht\" (UID: \"0cd862fe-c896-4fa6-a9ba-b1af6441f777\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.148296 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ftdjt\" (UniqueName: \"kubernetes.io/projected/0cd862fe-c896-4fa6-a9ba-b1af6441f777-kube-api-access-ftdjt\") pod \"infra-operator-controller-manager-7b9c774f96-2gnht\" (UID: \"0cd862fe-c896-4fa6-a9ba-b1af6441f777\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.148431 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dghnl\" (UniqueName: \"kubernetes.io/projected/b94397e1-cedc-4048-9253-12c60b0a9bfd-kube-api-access-dghnl\") pod \"keystone-operator-controller-manager-768b96df4c-q9lpg\" (UID: \"b94397e1-cedc-4048-9253-12c60b0a9bfd\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-q9lpg"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.148557 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zsr2d\" (UniqueName: \"kubernetes.io/projected/3e1ee5ea-abd4-4a73-840e-43fbd3732cfd-kube-api-access-zsr2d\") pod \"glance-operator-controller-manager-79df6bcc97-kzvs8\" (UID: \"3e1ee5ea-abd4-4a73-840e-43fbd3732cfd\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-kzvs8"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.148674 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qn9r\" (UniqueName: \"kubernetes.io/projected/047a8026-b206-4eb6-9630-3b550af68d3a-kube-api-access-7qn9r\") pod \"horizon-operator-controller-manager-8464cc45fb-qrqf8\" (UID: \"047a8026-b206-4eb6-9630-3b550af68d3a\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-qrqf8"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.148772 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9mpq7\" (UniqueName: \"kubernetes.io/projected/cdeba207-ced7-4575-9c08-c001d85b0a93-kube-api-access-9mpq7\") pod \"ironic-operator-controller-manager-6f787dddc9-p27jn\" (UID: \"cdeba207-ced7-4575-9c08-c001d85b0a93\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-p27jn"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.171620 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsr2d\" (UniqueName: \"kubernetes.io/projected/3e1ee5ea-abd4-4a73-840e-43fbd3732cfd-kube-api-access-zsr2d\") pod \"glance-operator-controller-manager-79df6bcc97-kzvs8\" (UID: \"3e1ee5ea-abd4-4a73-840e-43fbd3732cfd\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-kzvs8"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.171638 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mpq7\" (UniqueName: \"kubernetes.io/projected/cdeba207-ced7-4575-9c08-c001d85b0a93-kube-api-access-9mpq7\") pod \"ironic-operator-controller-manager-6f787dddc9-p27jn\" (UID: \"cdeba207-ced7-4575-9c08-c001d85b0a93\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-p27jn"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.171690 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-x8k5g"]
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.171939 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-4zf8v"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.173759 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-x8k5g"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.175781 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-nfmsr"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.179242 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qn9r\" (UniqueName: \"kubernetes.io/projected/047a8026-b206-4eb6-9630-3b550af68d3a-kube-api-access-7qn9r\") pod \"horizon-operator-controller-manager-8464cc45fb-qrqf8\" (UID: \"047a8026-b206-4eb6-9630-3b550af68d3a\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-qrqf8"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.180506 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-78t28"]
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.181758 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-78t28"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.187254 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-wxsgc"]
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.188335 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wxsgc"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.188908 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-nchmh"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.191723 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-c6sjh"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.197632 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-x8k5g"]
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.211151 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bg8b9"]
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.216257 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-qrqf8"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.218875 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-78t28"]
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.237111 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-kzvs8"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.247990 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-wxsgc"]
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.250225 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zrs77\" (UniqueName: \"kubernetes.io/projected/6647190b-c26b-4c57-bc84-7e5cfe6a5649-kube-api-access-zrs77\") pod \"nova-operator-controller-manager-5d488d59fb-wxsgc\" (UID: \"6647190b-c26b-4c57-bc84-7e5cfe6a5649\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wxsgc"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.250268 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dghnl\" (UniqueName: \"kubernetes.io/projected/b94397e1-cedc-4048-9253-12c60b0a9bfd-kube-api-access-dghnl\") pod \"keystone-operator-controller-manager-768b96df4c-q9lpg\" (UID: \"b94397e1-cedc-4048-9253-12c60b0a9bfd\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-q9lpg"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.250315 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbbjw\" (UniqueName: \"kubernetes.io/projected/9225dfe1-877e-43a2-9034-0e355019aa04-kube-api-access-wbbjw\") pod \"manila-operator-controller-manager-55f864c847-x8k5g\" (UID: \"9225dfe1-877e-43a2-9034-0e355019aa04\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-x8k5g"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.250377 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4r6wh\" (UniqueName: \"kubernetes.io/projected/adc31858-63eb-4d03-b79c-c1a4054725af-kube-api-access-4r6wh\") pod \"neutron-operator-controller-manager-767865f676-78t28\" (UID: \"adc31858-63eb-4d03-b79c-c1a4054725af\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-78t28"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.250411 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert\") pod \"infra-operator-controller-manager-7b9c774f96-2gnht\" (UID: \"0cd862fe-c896-4fa6-a9ba-b1af6441f777\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.250436 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8czz9\" (UniqueName: \"kubernetes.io/projected/d9dca6f4-a577-44fa-a959-8398fb57dca0-kube-api-access-8czz9\") pod \"mariadb-operator-controller-manager-67ccfc9778-bg8b9\" (UID: \"d9dca6f4-a577-44fa-a959-8398fb57dca0\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bg8b9"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.250462 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ftdjt\" (UniqueName: \"kubernetes.io/projected/0cd862fe-c896-4fa6-a9ba-b1af6441f777-kube-api-access-ftdjt\") pod \"infra-operator-controller-manager-7b9c774f96-2gnht\" (UID: \"0cd862fe-c896-4fa6-a9ba-b1af6441f777\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht"
Mar 19 10:39:28 crc kubenswrapper[4765]: E0319 10:39:28.251510 4765 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found
Mar 19 10:39:28 crc kubenswrapper[4765]: E0319 10:39:28.251558 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert podName:0cd862fe-c896-4fa6-a9ba-b1af6441f777 nodeName:}" failed. No retries permitted until 2026-03-19 10:39:28.751541314 +0000 UTC m=+1067.100486846 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert") pod "infra-operator-controller-manager-7b9c774f96-2gnht" (UID: "0cd862fe-c896-4fa6-a9ba-b1af6441f777") : secret "infra-operator-webhook-server-cert" not found
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.254670 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-p27jn"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.269090 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ftdjt\" (UniqueName: \"kubernetes.io/projected/0cd862fe-c896-4fa6-a9ba-b1af6441f777-kube-api-access-ftdjt\") pod \"infra-operator-controller-manager-7b9c774f96-2gnht\" (UID: \"0cd862fe-c896-4fa6-a9ba-b1af6441f777\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.278921 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dghnl\" (UniqueName: \"kubernetes.io/projected/b94397e1-cedc-4048-9253-12c60b0a9bfd-kube-api-access-dghnl\") pod \"keystone-operator-controller-manager-768b96df4c-q9lpg\" (UID: \"b94397e1-cedc-4048-9253-12c60b0a9bfd\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-q9lpg"
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.319877 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-b4kfn"]
Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.327590 4765 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-b4kfn" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.342527 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-khzzv" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.347183 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-v78t5"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.348721 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-v78t5" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.350581 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-sg6tj" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.352811 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbbjw\" (UniqueName: \"kubernetes.io/projected/9225dfe1-877e-43a2-9034-0e355019aa04-kube-api-access-wbbjw\") pod \"manila-operator-controller-manager-55f864c847-x8k5g\" (UID: \"9225dfe1-877e-43a2-9034-0e355019aa04\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-x8k5g" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.352892 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4r6wh\" (UniqueName: \"kubernetes.io/projected/adc31858-63eb-4d03-b79c-c1a4054725af-kube-api-access-4r6wh\") pod \"neutron-operator-controller-manager-767865f676-78t28\" (UID: \"adc31858-63eb-4d03-b79c-c1a4054725af\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-78t28" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.352967 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-8czz9\" (UniqueName: \"kubernetes.io/projected/d9dca6f4-a577-44fa-a959-8398fb57dca0-kube-api-access-8czz9\") pod \"mariadb-operator-controller-manager-67ccfc9778-bg8b9\" (UID: \"d9dca6f4-a577-44fa-a959-8398fb57dca0\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bg8b9" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.353053 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zrs77\" (UniqueName: \"kubernetes.io/projected/6647190b-c26b-4c57-bc84-7e5cfe6a5649-kube-api-access-zrs77\") pod \"nova-operator-controller-manager-5d488d59fb-wxsgc\" (UID: \"6647190b-c26b-4c57-bc84-7e5cfe6a5649\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wxsgc" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.378694 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4r6wh\" (UniqueName: \"kubernetes.io/projected/adc31858-63eb-4d03-b79c-c1a4054725af-kube-api-access-4r6wh\") pod \"neutron-operator-controller-manager-767865f676-78t28\" (UID: \"adc31858-63eb-4d03-b79c-c1a4054725af\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-78t28" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.388707 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8czz9\" (UniqueName: \"kubernetes.io/projected/d9dca6f4-a577-44fa-a959-8398fb57dca0-kube-api-access-8czz9\") pod \"mariadb-operator-controller-manager-67ccfc9778-bg8b9\" (UID: \"d9dca6f4-a577-44fa-a959-8398fb57dca0\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bg8b9" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.394146 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zrs77\" (UniqueName: \"kubernetes.io/projected/6647190b-c26b-4c57-bc84-7e5cfe6a5649-kube-api-access-zrs77\") pod \"nova-operator-controller-manager-5d488d59fb-wxsgc\" 
(UID: \"6647190b-c26b-4c57-bc84-7e5cfe6a5649\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wxsgc" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.400922 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-b4kfn"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.403498 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.405064 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbbjw\" (UniqueName: \"kubernetes.io/projected/9225dfe1-877e-43a2-9034-0e355019aa04-kube-api-access-wbbjw\") pod \"manila-operator-controller-manager-55f864c847-x8k5g\" (UID: \"9225dfe1-877e-43a2-9034-0e355019aa04\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-x8k5g" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.414948 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.417620 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.417797 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-xtgsp" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.417908 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-v78t5"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.427741 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.428857 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.434911 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-f8fsg" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.435115 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-hfm25"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.435980 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-hfm25" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.443291 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-thr56" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.443523 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.449039 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.455130 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-hfm25"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.462681 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6wdfh\" (UniqueName: \"kubernetes.io/projected/473e9670-e72d-4e54-8b06-9d73666cbfc0-kube-api-access-6wdfh\") pod \"octavia-operator-controller-manager-5b9f45d989-b4kfn\" (UID: \"473e9670-e72d-4e54-8b06-9d73666cbfc0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-b4kfn" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.463074 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dzdwh\" (UniqueName: \"kubernetes.io/projected/412ddd32-a861-4cec-8d5e-bb21069835e9-kube-api-access-dzdwh\") pod \"ovn-operator-controller-manager-884679f54-v78t5\" (UID: \"412ddd32-a861-4cec-8d5e-bb21069835e9\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-v78t5" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.475867 4765 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.476571 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.493112 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-8dxx2" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.493238 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.508227 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.512720 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.517537 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-jpwms" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.528568 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-q9lpg" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.532571 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.554836 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bg8b9" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.572799 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-x8k5g" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.580602 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.582005 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.596807 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-pnfrn" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.606260 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.607305 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-78swp\" (UniqueName: \"kubernetes.io/projected/4be8b0ab-5eda-4bf9-8fc4-20dcbe9c406d-kube-api-access-78swp\") pod \"placement-operator-controller-manager-5784578c99-2zbqz\" (UID: \"4be8b0ab-5eda-4bf9-8fc4-20dcbe9c406d\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.607607 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-vqghn\" (UID: \"981806c8-2390-44ac-a6f8-81c5f5bb0374\") " 
pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.607727 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-trj6r\" (UniqueName: \"kubernetes.io/projected/ae2caf34-b7b2-486c-9e9e-a48cf04eed87-kube-api-access-trj6r\") pod \"telemetry-operator-controller-manager-d6b694c5-7wsnh\" (UID: \"ae2caf34-b7b2-486c-9e9e-a48cf04eed87\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.607930 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6wdfh\" (UniqueName: \"kubernetes.io/projected/473e9670-e72d-4e54-8b06-9d73666cbfc0-kube-api-access-6wdfh\") pod \"octavia-operator-controller-manager-5b9f45d989-b4kfn\" (UID: \"473e9670-e72d-4e54-8b06-9d73666cbfc0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-b4kfn" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.608117 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zn6k2\" (UniqueName: \"kubernetes.io/projected/c3466125-06fe-4c5d-872d-a778806a0e23-kube-api-access-zn6k2\") pod \"swift-operator-controller-manager-c674c5965-hfm25\" (UID: \"c3466125-06fe-4c5d-872d-a778806a0e23\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-hfm25" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.608740 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dzdwh\" (UniqueName: \"kubernetes.io/projected/412ddd32-a861-4cec-8d5e-bb21069835e9-kube-api-access-dzdwh\") pod \"ovn-operator-controller-manager-884679f54-v78t5\" (UID: \"412ddd32-a861-4cec-8d5e-bb21069835e9\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-v78t5" Mar 19 10:39:28 crc kubenswrapper[4765]: 
I0319 10:39:28.608891 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7vh2\" (UniqueName: \"kubernetes.io/projected/981806c8-2390-44ac-a6f8-81c5f5bb0374-kube-api-access-w7vh2\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-vqghn\" (UID: \"981806c8-2390-44ac-a6f8-81c5f5bb0374\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.682311 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wxsgc" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.682626 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-78t28" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.683425 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dzdwh\" (UniqueName: \"kubernetes.io/projected/412ddd32-a861-4cec-8d5e-bb21069835e9-kube-api-access-dzdwh\") pod \"ovn-operator-controller-manager-884679f54-v78t5\" (UID: \"412ddd32-a861-4cec-8d5e-bb21069835e9\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-v78t5" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.683671 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6wdfh\" (UniqueName: \"kubernetes.io/projected/473e9670-e72d-4e54-8b06-9d73666cbfc0-kube-api-access-6wdfh\") pod \"octavia-operator-controller-manager-5b9f45d989-b4kfn\" (UID: \"473e9670-e72d-4e54-8b06-9d73666cbfc0\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-b4kfn" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.702367 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-v78t5" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.710065 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-vqghn\" (UID: \"981806c8-2390-44ac-a6f8-81c5f5bb0374\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.710123 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-trj6r\" (UniqueName: \"kubernetes.io/projected/ae2caf34-b7b2-486c-9e9e-a48cf04eed87-kube-api-access-trj6r\") pod \"telemetry-operator-controller-manager-d6b694c5-7wsnh\" (UID: \"ae2caf34-b7b2-486c-9e9e-a48cf04eed87\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.710207 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zn6k2\" (UniqueName: \"kubernetes.io/projected/c3466125-06fe-4c5d-872d-a778806a0e23-kube-api-access-zn6k2\") pod \"swift-operator-controller-manager-c674c5965-hfm25\" (UID: \"c3466125-06fe-4c5d-872d-a778806a0e23\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-hfm25" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.710242 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6f9xt\" (UniqueName: \"kubernetes.io/projected/3c3ef321-6a40-4d2e-a414-ad6a65cd32cf-kube-api-access-6f9xt\") pod \"watcher-operator-controller-manager-6c4d75f7f9-dbv47\" (UID: \"3c3ef321-6a40-4d2e-a414-ad6a65cd32cf\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 
10:39:28.710271 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7vh2\" (UniqueName: \"kubernetes.io/projected/981806c8-2390-44ac-a6f8-81c5f5bb0374-kube-api-access-w7vh2\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-vqghn\" (UID: \"981806c8-2390-44ac-a6f8-81c5f5bb0374\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.710315 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q7s5\" (UniqueName: \"kubernetes.io/projected/e843ba99-8859-41d7-9142-ed9227b4d8e1-kube-api-access-8q7s5\") pod \"test-operator-controller-manager-5c5cb9c4d7-xdlrz\" (UID: \"e843ba99-8859-41d7-9142-ed9227b4d8e1\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.710354 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-78swp\" (UniqueName: \"kubernetes.io/projected/4be8b0ab-5eda-4bf9-8fc4-20dcbe9c406d-kube-api-access-78swp\") pod \"placement-operator-controller-manager-5784578c99-2zbqz\" (UID: \"4be8b0ab-5eda-4bf9-8fc4-20dcbe9c406d\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz" Mar 19 10:39:28 crc kubenswrapper[4765]: E0319 10:39:28.710688 4765 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 10:39:28 crc kubenswrapper[4765]: E0319 10:39:28.710766 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert podName:981806c8-2390-44ac-a6f8-81c5f5bb0374 nodeName:}" failed. No retries permitted until 2026-03-19 10:39:29.210745199 +0000 UTC m=+1067.559690741 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-vqghn" (UID: "981806c8-2390-44ac-a6f8-81c5f5bb0374") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.720656 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.721700 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.727161 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.729587 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.739152 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7vh2\" (UniqueName: \"kubernetes.io/projected/981806c8-2390-44ac-a6f8-81c5f5bb0374-kube-api-access-w7vh2\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-vqghn\" (UID: \"981806c8-2390-44ac-a6f8-81c5f5bb0374\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.739557 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-8vsps" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.744086 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg"] Mar 19 10:39:28 crc 
kubenswrapper[4765]: I0319 10:39:28.788522 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zn6k2\" (UniqueName: \"kubernetes.io/projected/c3466125-06fe-4c5d-872d-a778806a0e23-kube-api-access-zn6k2\") pod \"swift-operator-controller-manager-c674c5965-hfm25\" (UID: \"c3466125-06fe-4c5d-872d-a778806a0e23\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-hfm25" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.794710 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-78swp\" (UniqueName: \"kubernetes.io/projected/4be8b0ab-5eda-4bf9-8fc4-20dcbe9c406d-kube-api-access-78swp\") pod \"placement-operator-controller-manager-5784578c99-2zbqz\" (UID: \"4be8b0ab-5eda-4bf9-8fc4-20dcbe9c406d\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.798075 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hsxfs"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.799144 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hsxfs" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.820497 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hsxfs"] Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.824395 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-controller-manager-dockercfg-q46zg" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.825414 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.826613 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6f9xt\" (UniqueName: \"kubernetes.io/projected/3c3ef321-6a40-4d2e-a414-ad6a65cd32cf-kube-api-access-6f9xt\") pod \"watcher-operator-controller-manager-6c4d75f7f9-dbv47\" (UID: \"3c3ef321-6a40-4d2e-a414-ad6a65cd32cf\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.826657 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.826709 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q7s5\" (UniqueName: \"kubernetes.io/projected/e843ba99-8859-41d7-9142-ed9227b4d8e1-kube-api-access-8q7s5\") pod \"test-operator-controller-manager-5c5cb9c4d7-xdlrz\" (UID: \"e843ba99-8859-41d7-9142-ed9227b4d8e1\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.826789 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 
10:39:28.826820 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4sxw\" (UniqueName: \"kubernetes.io/projected/970bc693-0463-4dfe-8870-fac695fffcae-kube-api-access-r4sxw\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.826869 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert\") pod \"infra-operator-controller-manager-7b9c774f96-2gnht\" (UID: \"0cd862fe-c896-4fa6-a9ba-b1af6441f777\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht" Mar 19 10:39:28 crc kubenswrapper[4765]: E0319 10:39:28.827034 4765 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 10:39:28 crc kubenswrapper[4765]: E0319 10:39:28.827091 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert podName:0cd862fe-c896-4fa6-a9ba-b1af6441f777 nodeName:}" failed. No retries permitted until 2026-03-19 10:39:29.827073253 +0000 UTC m=+1068.176018795 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert") pod "infra-operator-controller-manager-7b9c774f96-2gnht" (UID: "0cd862fe-c896-4fa6-a9ba-b1af6441f777") : secret "infra-operator-webhook-server-cert" not found Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.843645 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-hfm25" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.880331 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-trj6r\" (UniqueName: \"kubernetes.io/projected/ae2caf34-b7b2-486c-9e9e-a48cf04eed87-kube-api-access-trj6r\") pod \"telemetry-operator-controller-manager-d6b694c5-7wsnh\" (UID: \"ae2caf34-b7b2-486c-9e9e-a48cf04eed87\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.893439 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6f9xt\" (UniqueName: \"kubernetes.io/projected/3c3ef321-6a40-4d2e-a414-ad6a65cd32cf-kube-api-access-6f9xt\") pod \"watcher-operator-controller-manager-6c4d75f7f9-dbv47\" (UID: \"3c3ef321-6a40-4d2e-a414-ad6a65cd32cf\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.897563 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8q7s5\" (UniqueName: \"kubernetes.io/projected/e843ba99-8859-41d7-9142-ed9227b4d8e1-kube-api-access-8q7s5\") pod \"test-operator-controller-manager-5c5cb9c4d7-xdlrz\" (UID: \"e843ba99-8859-41d7-9142-ed9227b4d8e1\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.928184 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h52m\" (UniqueName: \"kubernetes.io/projected/408f748b-ca2b-4ae8-8994-63d7da422df9-kube-api-access-8h52m\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hsxfs\" (UID: \"408f748b-ca2b-4ae8-8994-63d7da422df9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hsxfs" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.928495 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4sxw\" (UniqueName: \"kubernetes.io/projected/970bc693-0463-4dfe-8870-fac695fffcae-kube-api-access-r4sxw\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.928711 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.928881 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:28 crc kubenswrapper[4765]: E0319 10:39:28.929097 4765 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 10:39:28 crc kubenswrapper[4765]: E0319 10:39:28.929562 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs podName:970bc693-0463-4dfe-8870-fac695fffcae nodeName:}" failed. No retries permitted until 2026-03-19 10:39:29.429528762 +0000 UTC m=+1067.778474304 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs") pod "openstack-operator-controller-manager-c9c9c96bc-4hzdg" (UID: "970bc693-0463-4dfe-8870-fac695fffcae") : secret "webhook-server-cert" not found Mar 19 10:39:28 crc kubenswrapper[4765]: E0319 10:39:28.929190 4765 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 10:39:28 crc kubenswrapper[4765]: E0319 10:39:28.930024 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs podName:970bc693-0463-4dfe-8870-fac695fffcae nodeName:}" failed. No retries permitted until 2026-03-19 10:39:29.430014515 +0000 UTC m=+1067.778960047 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs") pod "openstack-operator-controller-manager-c9c9c96bc-4hzdg" (UID: "970bc693-0463-4dfe-8870-fac695fffcae") : secret "metrics-server-cert" not found Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.947522 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.949737 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4sxw\" (UniqueName: \"kubernetes.io/projected/970bc693-0463-4dfe-8870-fac695fffcae-kube-api-access-r4sxw\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.952329 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47" Mar 19 10:39:28 crc kubenswrapper[4765]: I0319 10:39:28.978505 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-b4kfn" Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.034875 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h52m\" (UniqueName: \"kubernetes.io/projected/408f748b-ca2b-4ae8-8994-63d7da422df9-kube-api-access-8h52m\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hsxfs\" (UID: \"408f748b-ca2b-4ae8-8994-63d7da422df9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hsxfs" Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.062406 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h52m\" (UniqueName: \"kubernetes.io/projected/408f748b-ca2b-4ae8-8994-63d7da422df9-kube-api-access-8h52m\") pod \"rabbitmq-cluster-operator-manager-668c99d594-hsxfs\" (UID: \"408f748b-ca2b-4ae8-8994-63d7da422df9\") " pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hsxfs" Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.133919 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-qrqf8"] Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.140230 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-4zf8v"] Mar 19 10:39:29 crc kubenswrapper[4765]: W0319 10:39:29.159289 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod123b9f81_d315_44b3_a6ec_d777cc18ab7b.slice/crio-348e49faa0444cc0948dbd3e24e8f321a497f7135585e0eaa3920dc9ff1d867e WatchSource:0}: Error finding container 
348e49faa0444cc0948dbd3e24e8f321a497f7135585e0eaa3920dc9ff1d867e: Status 404 returned error can't find the container with id 348e49faa0444cc0948dbd3e24e8f321a497f7135585e0eaa3920dc9ff1d867e Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.190033 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz" Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.241536 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-vqghn\" (UID: \"981806c8-2390-44ac-a6f8-81c5f5bb0374\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.241725 4765 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.242761 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert podName:981806c8-2390-44ac-a6f8-81c5f5bb0374 nodeName:}" failed. No retries permitted until 2026-03-19 10:39:30.242727387 +0000 UTC m=+1068.591672929 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-vqghn" (UID: "981806c8-2390-44ac-a6f8-81c5f5bb0374") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.250175 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hsxfs" Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.254676 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-kvrz2"] Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.281746 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-ncc44"] Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.290283 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-6h4tg"] Mar 19 10:39:29 crc kubenswrapper[4765]: W0319 10:39:29.295917 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1954f819_78c2_46fd_a6bf_c626d50ef527.slice/crio-40d41f841c4bbd3c9f2cf6fb26a56c2a98661fe683f15e3a1828771a45159fd1 WatchSource:0}: Error finding container 40d41f841c4bbd3c9f2cf6fb26a56c2a98661fe683f15e3a1828771a45159fd1: Status 404 returned error can't find the container with id 40d41f841c4bbd3c9f2cf6fb26a56c2a98661fe683f15e3a1828771a45159fd1 Mar 19 10:39:29 crc kubenswrapper[4765]: W0319 10:39:29.302161 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc5e9a9c2_b7c8_4e1a_8c60_e6fcb0616e01.slice/crio-c642405fbe002cd495c30184429cc392afb7ca1fee5296849b0c8cf2c7871317 WatchSource:0}: Error finding container c642405fbe002cd495c30184429cc392afb7ca1fee5296849b0c8cf2c7871317: Status 404 returned error can't find the container with id c642405fbe002cd495c30184429cc392afb7ca1fee5296849b0c8cf2c7871317 Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.406817 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bg8b9"] Mar 19 10:39:29 crc 
kubenswrapper[4765]: W0319 10:39:29.416428 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9dca6f4_a577_44fa_a959_8398fb57dca0.slice/crio-5b0608f3239239a1cf0c69767c748f1902a80db93e6c190e4c06cfe3490c730a WatchSource:0}: Error finding container 5b0608f3239239a1cf0c69767c748f1902a80db93e6c190e4c06cfe3490c730a: Status 404 returned error can't find the container with id 5b0608f3239239a1cf0c69767c748f1902a80db93e6c190e4c06cfe3490c730a Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.447586 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.447705 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.447905 4765 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.447993 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs podName:970bc693-0463-4dfe-8870-fac695fffcae nodeName:}" failed. No retries permitted until 2026-03-19 10:39:30.447968893 +0000 UTC m=+1068.796914435 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs") pod "openstack-operator-controller-manager-c9c9c96bc-4hzdg" (UID: "970bc693-0463-4dfe-8870-fac695fffcae") : secret "webhook-server-cert" not found Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.448442 4765 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.448475 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs podName:970bc693-0463-4dfe-8870-fac695fffcae nodeName:}" failed. No retries permitted until 2026-03-19 10:39:30.448466747 +0000 UTC m=+1068.797412279 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs") pod "openstack-operator-controller-manager-c9c9c96bc-4hzdg" (UID: "970bc693-0463-4dfe-8870-fac695fffcae") : secret "metrics-server-cert" not found Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.609189 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-q9lpg"] Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.626226 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-kzvs8"] Mar 19 10:39:29 crc kubenswrapper[4765]: W0319 10:39:29.632090 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcdeba207_ced7_4575_9c08_c001d85b0a93.slice/crio-7eefdf8f7e9e661d272d6fb0a620180a19fa40f9f1363e41821434520eea19dc WatchSource:0}: Error finding container 7eefdf8f7e9e661d272d6fb0a620180a19fa40f9f1363e41821434520eea19dc: Status 404 returned error can't find the 
container with id 7eefdf8f7e9e661d272d6fb0a620180a19fa40f9f1363e41821434520eea19dc Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.637327 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-p27jn"] Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.645528 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-78t28"] Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.651104 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-x8k5g"] Mar 19 10:39:29 crc kubenswrapper[4765]: W0319 10:39:29.657522 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9225dfe1_877e_43a2_9034_0e355019aa04.slice/crio-122c218c01f638f56fb3750ced4c174726f67e9f21fc171cea15f193c7700603 WatchSource:0}: Error finding container 122c218c01f638f56fb3750ced4c174726f67e9f21fc171cea15f193c7700603: Status 404 returned error can't find the container with id 122c218c01f638f56fb3750ced4c174726f67e9f21fc171cea15f193c7700603 Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.712752 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz"] Mar 19 10:39:29 crc kubenswrapper[4765]: W0319 10:39:29.730214 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6647190b_c26b_4c57_bc84_7e5cfe6a5649.slice/crio-bb1f09bed63e2ba5f613c386fe274fb04b418e8bcd8a35626d9cc665ef5e9f74 WatchSource:0}: Error finding container bb1f09bed63e2ba5f613c386fe274fb04b418e8bcd8a35626d9cc665ef5e9f74: Status 404 returned error can't find the container with id bb1f09bed63e2ba5f613c386fe274fb04b418e8bcd8a35626d9cc665ef5e9f74 Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 
10:39:29.732243 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-wxsgc"] Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.744927 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-hfm25"] Mar 19 10:39:29 crc kubenswrapper[4765]: W0319 10:39:29.748885 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4be8b0ab_5eda_4bf9_8fc4_20dcbe9c406d.slice/crio-8c6e75decfd3704ce071fe940e5b458a931547906f32072f7dafe777c13748bd WatchSource:0}: Error finding container 8c6e75decfd3704ce071fe940e5b458a931547906f32072f7dafe777c13748bd: Status 404 returned error can't find the container with id 8c6e75decfd3704ce071fe940e5b458a931547906f32072f7dafe777c13748bd Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.750977 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-78swp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod placement-operator-controller-manager-5784578c99-2zbqz_openstack-operators(4be8b0ab-5eda-4bf9-8fc4-20dcbe9c406d): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 10:39:29 crc kubenswrapper[4765]: W0319 10:39:29.751227 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod412ddd32_a861_4cec_8d5e_bb21069835e9.slice/crio-ef047de0f07549ce64abaaa43ade11adcd10ccd1120b5f030c1c86d51bb8dd30 WatchSource:0}: Error finding container 
ef047de0f07549ce64abaaa43ade11adcd10ccd1120b5f030c1c86d51bb8dd30: Status 404 returned error can't find the container with id ef047de0f07549ce64abaaa43ade11adcd10ccd1120b5f030c1c86d51bb8dd30 Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.751764 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-v78t5"] Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.752315 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz" podUID="4be8b0ab-5eda-4bf9-8fc4-20dcbe9c406d" Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.754462 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-dzdwh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-v78t5_openstack-operators(412ddd32-a861-4cec-8d5e-bb21069835e9): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 10:39:29 crc kubenswrapper[4765]: W0319 10:39:29.755265 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3466125_06fe_4c5d_872d_a778806a0e23.slice/crio-7da1864dcb01c80768a61f800f7266ab864e30684f4ad0d283650f8c658e892e WatchSource:0}: Error finding container 7da1864dcb01c80768a61f800f7266ab864e30684f4ad0d283650f8c658e892e: Status 404 returned error can't find the container with id 7da1864dcb01c80768a61f800f7266ab864e30684f4ad0d283650f8c658e892e Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.755535 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with 
ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-v78t5" podUID="412ddd32-a861-4cec-8d5e-bb21069835e9" Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.758225 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zn6k2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod swift-operator-controller-manager-c674c5965-hfm25_openstack-operators(c3466125-06fe-4c5d-872d-a778806a0e23): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.759480 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-hfm25" podUID="c3466125-06fe-4c5d-872d-a778806a0e23" Mar 19 10:39:29 crc kubenswrapper[4765]: W0319 10:39:29.847557 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podae2caf34_b7b2_486c_9e9e_a48cf04eed87.slice/crio-0a3b3bec382ebf1d4d2532ab7f26131531be098cf0963753f16c4374440c8dcb WatchSource:0}: Error finding container 0a3b3bec382ebf1d4d2532ab7f26131531be098cf0963753f16c4374440c8dcb: Status 404 returned error can't find the container with id 0a3b3bec382ebf1d4d2532ab7f26131531be098cf0963753f16c4374440c8dcb Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.849055 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh"] Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.853953 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-trj6r,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-7wsnh_openstack-operators(ae2caf34-b7b2-486c-9e9e-a48cf04eed87): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.855266 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh" podUID="ae2caf34-b7b2-486c-9e9e-a48cf04eed87" Mar 19 10:39:29 crc kubenswrapper[4765]: W0319 10:39:29.858756 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c3ef321_6a40_4d2e_a414_ad6a65cd32cf.slice/crio-45ae23be49ac2ea46016a49a98e4717e757b4a79a9d4a1f1c88ece9ce367c515 WatchSource:0}: Error finding container 45ae23be49ac2ea46016a49a98e4717e757b4a79a9d4a1f1c88ece9ce367c515: Status 404 returned error can't find the container with id 45ae23be49ac2ea46016a49a98e4717e757b4a79a9d4a1f1c88ece9ce367c515 Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.862507 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6f9xt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod watcher-operator-controller-manager-6c4d75f7f9-dbv47_openstack-operators(3c3ef321-6a40-4d2e-a414-ad6a65cd32cf): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.862678 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert\") pod \"infra-operator-controller-manager-7b9c774f96-2gnht\" (UID: \"0cd862fe-c896-4fa6-a9ba-b1af6441f777\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht" Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.862838 4765 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.862891 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert podName:0cd862fe-c896-4fa6-a9ba-b1af6441f777 nodeName:}" failed. No retries permitted until 2026-03-19 10:39:31.862875376 +0000 UTC m=+1070.211820918 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert") pod "infra-operator-controller-manager-7b9c774f96-2gnht" (UID: "0cd862fe-c896-4fa6-a9ba-b1af6441f777") : secret "infra-operator-webhook-server-cert" not found Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.863732 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47" podUID="3c3ef321-6a40-4d2e-a414-ad6a65cd32cf" Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.864837 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-b4kfn"] Mar 19 10:39:29 crc kubenswrapper[4765]: W0319 10:39:29.867273 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode843ba99_8859_41d7_9142_ed9227b4d8e1.slice/crio-7415af732225467c695cfc899093c71164cba12a8d1606924a1b8e148d5605ff WatchSource:0}: Error finding container 7415af732225467c695cfc899093c71164cba12a8d1606924a1b8e148d5605ff: Status 404 returned error can't find the container with id 7415af732225467c695cfc899093c71164cba12a8d1606924a1b8e148d5605ff Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.873417 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8q7s5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod test-operator-controller-manager-5c5cb9c4d7-xdlrz_openstack-operators(e843ba99-8859-41d7-9142-ed9227b4d8e1): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.874014 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47" event={"ID":"3c3ef321-6a40-4d2e-a414-ad6a65cd32cf","Type":"ContainerStarted","Data":"45ae23be49ac2ea46016a49a98e4717e757b4a79a9d4a1f1c88ece9ce367c515"} Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.874722 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz" podUID="e843ba99-8859-41d7-9142-ed9227b4d8e1" Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.876563 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image 
\\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47" podUID="3c3ef321-6a40-4d2e-a414-ad6a65cd32cf" Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.902130 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47"] Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.902226 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-ncc44" event={"ID":"1d54708d-8829-411d-a632-ce3b53b7aeaa","Type":"ContainerStarted","Data":"ffc88eadb09064ce1ebcabd322f01fb5765c3e9dfcb23b1cbbd4db13429023d8"} Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.905158 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-p27jn" event={"ID":"cdeba207-ced7-4575-9c08-c001d85b0a93","Type":"ContainerStarted","Data":"7eefdf8f7e9e661d272d6fb0a620180a19fa40f9f1363e41821434520eea19dc"} Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.906377 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kvrz2" event={"ID":"1954f819-78c2-46fd-a6bf-c626d50ef527","Type":"ContainerStarted","Data":"40d41f841c4bbd3c9f2cf6fb26a56c2a98661fe683f15e3a1828771a45159fd1"} Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.908071 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-q9lpg" event={"ID":"b94397e1-cedc-4048-9253-12c60b0a9bfd","Type":"ContainerStarted","Data":"1822c9a771d27586d6614e1f7a24d10f5ce805f0c6d4b05d4693552157b8db3e"} Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.909472 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6h4tg" event={"ID":"c5e9a9c2-b7c8-4e1a-8c60-e6fcb0616e01","Type":"ContainerStarted","Data":"c642405fbe002cd495c30184429cc392afb7ca1fee5296849b0c8cf2c7871317"} Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.910610 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-v78t5" event={"ID":"412ddd32-a861-4cec-8d5e-bb21069835e9","Type":"ContainerStarted","Data":"ef047de0f07549ce64abaaa43ade11adcd10ccd1120b5f030c1c86d51bb8dd30"} Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.911001 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz"] Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.911513 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-x8k5g" event={"ID":"9225dfe1-877e-43a2-9034-0e355019aa04","Type":"ContainerStarted","Data":"122c218c01f638f56fb3750ced4c174726f67e9f21fc171cea15f193c7700603"} Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.912607 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-v78t5" podUID="412ddd32-a861-4cec-8d5e-bb21069835e9" Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.912672 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-kzvs8" event={"ID":"3e1ee5ea-abd4-4a73-840e-43fbd3732cfd","Type":"ContainerStarted","Data":"d0bf9475901d5fe3513ee658fd15a9a41fceef5d3155fcdeca63e65722b766b9"} Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.914037 
4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bg8b9" event={"ID":"d9dca6f4-a577-44fa-a959-8398fb57dca0","Type":"ContainerStarted","Data":"5b0608f3239239a1cf0c69767c748f1902a80db93e6c190e4c06cfe3490c730a"} Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.916462 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-b4kfn" event={"ID":"473e9670-e72d-4e54-8b06-9d73666cbfc0","Type":"ContainerStarted","Data":"54805991f268c147286cdf3fd84200988b2f73d6e5c9a32d2265cfaffadadf55"} Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.918254 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wxsgc" event={"ID":"6647190b-c26b-4c57-bc84-7e5cfe6a5649","Type":"ContainerStarted","Data":"bb1f09bed63e2ba5f613c386fe274fb04b418e8bcd8a35626d9cc665ef5e9f74"} Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.921613 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-4zf8v" event={"ID":"123b9f81-d315-44b3-a6ec-d777cc18ab7b","Type":"ContainerStarted","Data":"348e49faa0444cc0948dbd3e24e8f321a497f7135585e0eaa3920dc9ff1d867e"} Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.925436 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh" event={"ID":"ae2caf34-b7b2-486c-9e9e-a48cf04eed87","Type":"ContainerStarted","Data":"0a3b3bec382ebf1d4d2532ab7f26131531be098cf0963753f16c4374440c8dcb"} Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.931178 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz" 
event={"ID":"4be8b0ab-5eda-4bf9-8fc4-20dcbe9c406d","Type":"ContainerStarted","Data":"8c6e75decfd3704ce071fe940e5b458a931547906f32072f7dafe777c13748bd"} Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.932294 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-hfm25" event={"ID":"c3466125-06fe-4c5d-872d-a778806a0e23","Type":"ContainerStarted","Data":"7da1864dcb01c80768a61f800f7266ab864e30684f4ad0d283650f8c658e892e"} Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.936816 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh" podUID="ae2caf34-b7b2-486c-9e9e-a48cf04eed87" Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.936881 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-qrqf8" event={"ID":"047a8026-b206-4eb6-9630-3b550af68d3a","Type":"ContainerStarted","Data":"aa672c2d0bee9693b337f333392a84e0671e9d2fdde19afc22aeb22059468f78"} Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.937027 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz" podUID="4be8b0ab-5eda-4bf9-8fc4-20dcbe9c406d" Mar 19 10:39:29 crc kubenswrapper[4765]: E0319 10:39:29.937023 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: 
\"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-hfm25" podUID="c3466125-06fe-4c5d-872d-a778806a0e23" Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.952013 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-78t28" event={"ID":"adc31858-63eb-4d03-b79c-c1a4054725af","Type":"ContainerStarted","Data":"7eaaabc39e608d22001be7c70679db5ffea1a17633dccb22e7206c6a2b6fdd49"} Mar 19 10:39:29 crc kubenswrapper[4765]: I0319 10:39:29.975032 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hsxfs"] Mar 19 10:39:29 crc kubenswrapper[4765]: W0319 10:39:29.992645 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod408f748b_ca2b_4ae8_8994_63d7da422df9.slice/crio-f20f868be3865a6e320d6d621545e02971bc5ea68cbcd771e6c01c5ba3cd4443 WatchSource:0}: Error finding container f20f868be3865a6e320d6d621545e02971bc5ea68cbcd771e6c01c5ba3cd4443: Status 404 returned error can't find the container with id f20f868be3865a6e320d6d621545e02971bc5ea68cbcd771e6c01c5ba3cd4443 Mar 19 10:39:30 crc kubenswrapper[4765]: I0319 10:39:30.268586 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-vqghn\" (UID: \"981806c8-2390-44ac-a6f8-81c5f5bb0374\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:39:30 crc kubenswrapper[4765]: E0319 10:39:30.269035 4765 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret 
"openstack-baremetal-operator-webhook-server-cert" not found Mar 19 10:39:30 crc kubenswrapper[4765]: E0319 10:39:30.269386 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert podName:981806c8-2390-44ac-a6f8-81c5f5bb0374 nodeName:}" failed. No retries permitted until 2026-03-19 10:39:32.269357641 +0000 UTC m=+1070.618303383 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-vqghn" (UID: "981806c8-2390-44ac-a6f8-81c5f5bb0374") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 10:39:30 crc kubenswrapper[4765]: I0319 10:39:30.473736 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:30 crc kubenswrapper[4765]: I0319 10:39:30.473842 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:30 crc kubenswrapper[4765]: E0319 10:39:30.473981 4765 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 10:39:30 crc kubenswrapper[4765]: E0319 10:39:30.474060 4765 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs podName:970bc693-0463-4dfe-8870-fac695fffcae nodeName:}" failed. No retries permitted until 2026-03-19 10:39:32.474040813 +0000 UTC m=+1070.822986415 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs") pod "openstack-operator-controller-manager-c9c9c96bc-4hzdg" (UID: "970bc693-0463-4dfe-8870-fac695fffcae") : secret "webhook-server-cert" not found Mar 19 10:39:30 crc kubenswrapper[4765]: E0319 10:39:30.474500 4765 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 10:39:30 crc kubenswrapper[4765]: E0319 10:39:30.474529 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs podName:970bc693-0463-4dfe-8870-fac695fffcae nodeName:}" failed. No retries permitted until 2026-03-19 10:39:32.474519616 +0000 UTC m=+1070.823465158 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs") pod "openstack-operator-controller-manager-c9c9c96bc-4hzdg" (UID: "970bc693-0463-4dfe-8870-fac695fffcae") : secret "metrics-server-cert" not found Mar 19 10:39:30 crc kubenswrapper[4765]: I0319 10:39:30.993018 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz" event={"ID":"e843ba99-8859-41d7-9142-ed9227b4d8e1","Type":"ContainerStarted","Data":"7415af732225467c695cfc899093c71164cba12a8d1606924a1b8e148d5605ff"} Mar 19 10:39:30 crc kubenswrapper[4765]: E0319 10:39:30.999193 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz" podUID="e843ba99-8859-41d7-9142-ed9227b4d8e1" Mar 19 10:39:30 crc kubenswrapper[4765]: I0319 10:39:30.999876 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hsxfs" event={"ID":"408f748b-ca2b-4ae8-8994-63d7da422df9","Type":"ContainerStarted","Data":"f20f868be3865a6e320d6d621545e02971bc5ea68cbcd771e6c01c5ba3cd4443"} Mar 19 10:39:31 crc kubenswrapper[4765]: E0319 10:39:31.001281 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47" podUID="3c3ef321-6a40-4d2e-a414-ad6a65cd32cf" Mar 19 10:39:31 crc kubenswrapper[4765]: E0319 10:39:31.001277 4765 
pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/swift-operator@sha256:866844c5b88e1e0518ceb7490cac9d093da3fb8b2f27ba7bd9bd89f946b9ee6e\\\"\"" pod="openstack-operators/swift-operator-controller-manager-c674c5965-hfm25" podUID="c3466125-06fe-4c5d-872d-a778806a0e23" Mar 19 10:39:31 crc kubenswrapper[4765]: E0319 10:39:31.001730 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-v78t5" podUID="412ddd32-a861-4cec-8d5e-bb21069835e9" Mar 19 10:39:31 crc kubenswrapper[4765]: E0319 10:39:31.001857 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh" podUID="ae2caf34-b7b2-486c-9e9e-a48cf04eed87" Mar 19 10:39:31 crc kubenswrapper[4765]: E0319 10:39:31.003044 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/placement-operator@sha256:c8743a6661d118b0e5ba3eb110643358a8a3237dc75984a8f9829880b55a1622\\\"\"" pod="openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz" podUID="4be8b0ab-5eda-4bf9-8fc4-20dcbe9c406d" Mar 19 10:39:31 crc kubenswrapper[4765]: I0319 10:39:31.656663 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:39:31 crc kubenswrapper[4765]: I0319 10:39:31.656742 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:39:31 crc kubenswrapper[4765]: I0319 10:39:31.896635 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert\") pod \"infra-operator-controller-manager-7b9c774f96-2gnht\" (UID: \"0cd862fe-c896-4fa6-a9ba-b1af6441f777\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht" Mar 19 10:39:31 crc kubenswrapper[4765]: E0319 10:39:31.897073 4765 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 10:39:31 crc kubenswrapper[4765]: E0319 10:39:31.897246 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert podName:0cd862fe-c896-4fa6-a9ba-b1af6441f777 nodeName:}" failed. No retries permitted until 2026-03-19 10:39:35.897197152 +0000 UTC m=+1074.246142874 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert") pod "infra-operator-controller-manager-7b9c774f96-2gnht" (UID: "0cd862fe-c896-4fa6-a9ba-b1af6441f777") : secret "infra-operator-webhook-server-cert" not found Mar 19 10:39:32 crc kubenswrapper[4765]: E0319 10:39:32.012943 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/test-operator@sha256:43bd420bc05b4789243740bc75f61e10c7aac7883fc2f82b2d4d50085bc96c42\\\"\"" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz" podUID="e843ba99-8859-41d7-9142-ed9227b4d8e1" Mar 19 10:39:32 crc kubenswrapper[4765]: E0319 10:39:32.013195 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/watcher-operator@sha256:d9c55e8c6304a0e32289b5e8c69a87ea59b9968918a5c85b7c384633df82c807\\\"\"" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47" podUID="3c3ef321-6a40-4d2e-a414-ad6a65cd32cf" Mar 19 10:39:32 crc kubenswrapper[4765]: I0319 10:39:32.302659 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-vqghn\" (UID: \"981806c8-2390-44ac-a6f8-81c5f5bb0374\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:39:32 crc kubenswrapper[4765]: E0319 10:39:32.302861 4765 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 10:39:32 crc kubenswrapper[4765]: E0319 10:39:32.302948 4765 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert podName:981806c8-2390-44ac-a6f8-81c5f5bb0374 nodeName:}" failed. No retries permitted until 2026-03-19 10:39:36.302929366 +0000 UTC m=+1074.651874908 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-vqghn" (UID: "981806c8-2390-44ac-a6f8-81c5f5bb0374") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 10:39:32 crc kubenswrapper[4765]: I0319 10:39:32.506809 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:32 crc kubenswrapper[4765]: E0319 10:39:32.507089 4765 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 10:39:32 crc kubenswrapper[4765]: E0319 10:39:32.507540 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs podName:970bc693-0463-4dfe-8870-fac695fffcae nodeName:}" failed. No retries permitted until 2026-03-19 10:39:36.507512694 +0000 UTC m=+1074.856458236 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs") pod "openstack-operator-controller-manager-c9c9c96bc-4hzdg" (UID: "970bc693-0463-4dfe-8870-fac695fffcae") : secret "webhook-server-cert" not found Mar 19 10:39:32 crc kubenswrapper[4765]: E0319 10:39:32.507574 4765 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 10:39:32 crc kubenswrapper[4765]: E0319 10:39:32.507646 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs podName:970bc693-0463-4dfe-8870-fac695fffcae nodeName:}" failed. No retries permitted until 2026-03-19 10:39:36.507627277 +0000 UTC m=+1074.856572819 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs") pod "openstack-operator-controller-manager-c9c9c96bc-4hzdg" (UID: "970bc693-0463-4dfe-8870-fac695fffcae") : secret "metrics-server-cert" not found Mar 19 10:39:32 crc kubenswrapper[4765]: I0319 10:39:32.507450 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:35 crc kubenswrapper[4765]: I0319 10:39:35.963834 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert\") pod \"infra-operator-controller-manager-7b9c774f96-2gnht\" (UID: \"0cd862fe-c896-4fa6-a9ba-b1af6441f777\") " 
pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht" Mar 19 10:39:35 crc kubenswrapper[4765]: E0319 10:39:35.964060 4765 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 10:39:35 crc kubenswrapper[4765]: E0319 10:39:35.964433 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert podName:0cd862fe-c896-4fa6-a9ba-b1af6441f777 nodeName:}" failed. No retries permitted until 2026-03-19 10:39:43.964415662 +0000 UTC m=+1082.313361204 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert") pod "infra-operator-controller-manager-7b9c774f96-2gnht" (UID: "0cd862fe-c896-4fa6-a9ba-b1af6441f777") : secret "infra-operator-webhook-server-cert" not found Mar 19 10:39:36 crc kubenswrapper[4765]: I0319 10:39:36.370075 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-vqghn\" (UID: \"981806c8-2390-44ac-a6f8-81c5f5bb0374\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:39:36 crc kubenswrapper[4765]: E0319 10:39:36.370274 4765 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 10:39:36 crc kubenswrapper[4765]: E0319 10:39:36.370368 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert podName:981806c8-2390-44ac-a6f8-81c5f5bb0374 nodeName:}" failed. No retries permitted until 2026-03-19 10:39:44.370341762 +0000 UTC m=+1082.719287304 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-vqghn" (UID: "981806c8-2390-44ac-a6f8-81c5f5bb0374") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 10:39:36 crc kubenswrapper[4765]: I0319 10:39:36.575702 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:36 crc kubenswrapper[4765]: I0319 10:39:36.575829 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:36 crc kubenswrapper[4765]: E0319 10:39:36.575924 4765 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 10:39:36 crc kubenswrapper[4765]: E0319 10:39:36.576029 4765 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 10:39:36 crc kubenswrapper[4765]: E0319 10:39:36.576091 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs podName:970bc693-0463-4dfe-8870-fac695fffcae nodeName:}" failed. No retries permitted until 2026-03-19 10:39:44.576057041 +0000 UTC m=+1082.925002583 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs") pod "openstack-operator-controller-manager-c9c9c96bc-4hzdg" (UID: "970bc693-0463-4dfe-8870-fac695fffcae") : secret "metrics-server-cert" not found Mar 19 10:39:36 crc kubenswrapper[4765]: E0319 10:39:36.576127 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs podName:970bc693-0463-4dfe-8870-fac695fffcae nodeName:}" failed. No retries permitted until 2026-03-19 10:39:44.576111053 +0000 UTC m=+1082.925056825 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs") pod "openstack-operator-controller-manager-c9c9c96bc-4hzdg" (UID: "970bc693-0463-4dfe-8870-fac695fffcae") : secret "webhook-server-cert" not found Mar 19 10:39:42 crc kubenswrapper[4765]: E0319 10:39:42.106698 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777" Mar 19 10:39:42 crc kubenswrapper[4765]: E0319 10:39:42.107243 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-gsr8d,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-8d58dc466-ncc44_openstack-operators(1d54708d-8829-411d-a632-ce3b53b7aeaa): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 10:39:42 crc kubenswrapper[4765]: E0319 10:39:42.108561 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-ncc44" podUID="1d54708d-8829-411d-a632-ce3b53b7aeaa" Mar 19 10:39:42 crc kubenswrapper[4765]: E0319 10:39:42.752478 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a" Mar 19 10:39:42 crc kubenswrapper[4765]: E0319 10:39:42.752676 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container 
&Container{Name:manager,Image:quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-6wdfh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod octavia-operator-controller-manager-5b9f45d989-b4kfn_openstack-operators(473e9670-e72d-4e54-8b06-9d73666cbfc0): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 10:39:42 crc kubenswrapper[4765]: E0319 10:39:42.753864 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-b4kfn" podUID="473e9670-e72d-4e54-8b06-9d73666cbfc0" Mar 19 10:39:43 crc kubenswrapper[4765]: E0319 10:39:43.105691 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/octavia-operator@sha256:425fd66675becbe0ca2b2fe1a5a6694ac6e0b1cdce9a77a7a37f99785eadc74a\\\"\"" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-b4kfn" podUID="473e9670-e72d-4e54-8b06-9d73666cbfc0" Mar 19 10:39:43 crc kubenswrapper[4765]: E0319 10:39:43.105867 4765 pod_workers.go:1301] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-ncc44" podUID="1d54708d-8829-411d-a632-ce3b53b7aeaa" Mar 19 10:39:44 crc kubenswrapper[4765]: I0319 10:39:44.011399 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert\") pod \"infra-operator-controller-manager-7b9c774f96-2gnht\" (UID: \"0cd862fe-c896-4fa6-a9ba-b1af6441f777\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht" Mar 19 10:39:44 crc kubenswrapper[4765]: E0319 10:39:44.011594 4765 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 19 10:39:44 crc kubenswrapper[4765]: E0319 10:39:44.011690 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert podName:0cd862fe-c896-4fa6-a9ba-b1af6441f777 nodeName:}" failed. No retries permitted until 2026-03-19 10:40:00.011668811 +0000 UTC m=+1098.360614353 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert") pod "infra-operator-controller-manager-7b9c774f96-2gnht" (UID: "0cd862fe-c896-4fa6-a9ba-b1af6441f777") : secret "infra-operator-webhook-server-cert" not found Mar 19 10:39:44 crc kubenswrapper[4765]: I0319 10:39:44.417406 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-vqghn\" (UID: \"981806c8-2390-44ac-a6f8-81c5f5bb0374\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:39:44 crc kubenswrapper[4765]: E0319 10:39:44.417674 4765 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 10:39:44 crc kubenswrapper[4765]: E0319 10:39:44.417833 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert podName:981806c8-2390-44ac-a6f8-81c5f5bb0374 nodeName:}" failed. No retries permitted until 2026-03-19 10:40:00.417796306 +0000 UTC m=+1098.766742028 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-vqghn" (UID: "981806c8-2390-44ac-a6f8-81c5f5bb0374") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 19 10:39:44 crc kubenswrapper[4765]: I0319 10:39:44.621251 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:44 crc kubenswrapper[4765]: I0319 10:39:44.621377 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:39:44 crc kubenswrapper[4765]: E0319 10:39:44.621517 4765 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 19 10:39:44 crc kubenswrapper[4765]: E0319 10:39:44.621639 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs podName:970bc693-0463-4dfe-8870-fac695fffcae nodeName:}" failed. No retries permitted until 2026-03-19 10:40:00.621611024 +0000 UTC m=+1098.970556556 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs") pod "openstack-operator-controller-manager-c9c9c96bc-4hzdg" (UID: "970bc693-0463-4dfe-8870-fac695fffcae") : secret "metrics-server-cert" not found Mar 19 10:39:44 crc kubenswrapper[4765]: E0319 10:39:44.621532 4765 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 19 10:39:44 crc kubenswrapper[4765]: E0319 10:39:44.621728 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs podName:970bc693-0463-4dfe-8870-fac695fffcae nodeName:}" failed. No retries permitted until 2026-03-19 10:40:00.621708657 +0000 UTC m=+1098.970654199 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs") pod "openstack-operator-controller-manager-c9c9c96bc-4hzdg" (UID: "970bc693-0463-4dfe-8870-fac695fffcae") : secret "webhook-server-cert" not found Mar 19 10:39:50 crc kubenswrapper[4765]: E0319 10:39:50.639994 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 19 10:39:50 crc kubenswrapper[4765]: E0319 10:39:50.640727 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 
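The `durationBeforeRetry` values in the nestedpendingoperations entries above grow 4s → 8s → 16s: kubelet's volume manager doubles the delay after each failed MountVolume.SetUp attempt rather than retrying at a fixed interval. A minimal sketch of that doubling schedule is below; the starting value and the cap are assumptions for illustration (the log excerpt only shows the 4s/8s/16s steps, and upstream kubelet caps the delay at roughly two minutes).

```python
def backoff_delays(initial: float = 2.0, cap: float = 128.0, attempts: int = 8):
    """Yield the retry delay (seconds) after each consecutive failure.

    Models the doubling pattern seen in the nestedpendingoperations log
    entries (4s, 8s, 16s, ...). `initial` and `cap` are illustrative
    assumptions, not values taken from this log.
    """
    delay = initial
    for _ in range(attempts):
        delay = min(delay * 2, cap)
        yield delay


if __name__ == "__main__":
    # First three failures reproduce the delays visible in the log excerpt.
    print(list(backoff_delays(attempts=3)))  # [4.0, 8.0, 16.0]
```

Because the delay doubles, a secret that is created late (here, the various `*-webhook-server-cert` and `metrics-server-cert` secrets in `openstack-operators`) is picked up on the next scheduled retry rather than immediately, which is why the same "secret not found" error repeats with widening gaps until the secret exists.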
--metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zrs77,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-wxsgc_openstack-operators(6647190b-c26b-4c57-bc84-7e5cfe6a5649): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 10:39:50 crc kubenswrapper[4765]: E0319 10:39:50.641798 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wxsgc" podUID="6647190b-c26b-4c57-bc84-7e5cfe6a5649" Mar 19 10:39:51 crc kubenswrapper[4765]: E0319 10:39:51.162706 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wxsgc" podUID="6647190b-c26b-4c57-bc84-7e5cfe6a5649" Mar 19 10:39:51 crc kubenswrapper[4765]: E0319 10:39:51.207887 4765 log.go:32] "PullImage from image service failed" err="rpc 
error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2" Mar 19 10:39:51 crc kubenswrapper[4765]: E0319 10:39:51.208124 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:operator,Image:quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2,Command:[/manager],Args:[],WorkingDir:,Ports:[]ContainerPort{ContainerPort{Name:metrics,HostPort:0,ContainerPort:9782,Protocol:TCP,HostIP:,},},Env:[]EnvVar{EnvVar{Name:OPERATOR_NAMESPACE,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:metadata.namespace,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{200 -3} {} 200m DecimalSI},memory: {{524288000 0} {} 500Mi BinarySI},},Requests:ResourceList{cpu: {{5 -3} {} 5m DecimalSI},memory: {{67108864 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-8h52m,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000660000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod rabbitmq-cluster-operator-manager-668c99d594-hsxfs_openstack-operators(408f748b-ca2b-4ae8-8994-63d7da422df9): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 10:39:51 crc kubenswrapper[4765]: E0319 10:39:51.209307 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hsxfs" podUID="408f748b-ca2b-4ae8-8994-63d7da422df9" Mar 19 10:39:52 crc kubenswrapper[4765]: E0319 10:39:52.170473 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"operator\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/rabbitmq-cluster-operator@sha256:893e66303c1b0bc1d00a299a3f0380bad55c8dc813c8a1c6a4aab379f5aa12a2\\\"\"" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hsxfs" podUID="408f748b-ca2b-4ae8-8994-63d7da422df9" Mar 19 10:39:56 crc 
kubenswrapper[4765]: I0319 10:39:56.198946 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-hfm25" event={"ID":"c3466125-06fe-4c5d-872d-a778806a0e23","Type":"ContainerStarted","Data":"def3c6d5e2a2bff96be574030be294af0f0dbab55b253e11de38fcd64b080795"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.201168 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-hfm25" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.202513 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-p27jn" event={"ID":"cdeba207-ced7-4575-9c08-c001d85b0a93","Type":"ContainerStarted","Data":"65c762b9a2a4c3fbe876943a4b25b12bc453828e96246aef7b83c3c1753735f3"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.202857 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-p27jn" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.204138 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-ncc44" event={"ID":"1d54708d-8829-411d-a632-ce3b53b7aeaa","Type":"ContainerStarted","Data":"879c86249296e5298eb5b4f23c3815e1404e8a0117a4ec48232fa3a88cdd24f9"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.204664 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-ncc44" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.205933 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz" event={"ID":"4be8b0ab-5eda-4bf9-8fc4-20dcbe9c406d","Type":"ContainerStarted","Data":"067eb1d95a2f104217879f8e5a077dcacbe853616c6bcec4cd1e5c3559d961c9"} 
Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.206545 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.207090 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-kzvs8" event={"ID":"3e1ee5ea-abd4-4a73-840e-43fbd3732cfd","Type":"ContainerStarted","Data":"2a8ecedd7f4c6256970b33852245b3147870097bf3aaaf05fe9398ab16821e6f"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.207473 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-kzvs8" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.208383 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-qrqf8" event={"ID":"047a8026-b206-4eb6-9630-3b550af68d3a","Type":"ContainerStarted","Data":"f8e18e64be51cdf8d75a5b4f7fa9fb79bdcfc2ba5350e47c6bd49ced126039a6"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.209632 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-78t28" event={"ID":"adc31858-63eb-4d03-b79c-c1a4054725af","Type":"ContainerStarted","Data":"d9c6501024475b95a15d751117c033b6bfcd05ba11bc61a0055ddc2dac7fd694"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.210081 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-78t28" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.212552 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-q9lpg" 
event={"ID":"b94397e1-cedc-4048-9253-12c60b0a9bfd","Type":"ContainerStarted","Data":"83f80cf7b01a714d1e2980ef04e34e12d064c5c1507642efbf53452df1b7be1e"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.212592 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-q9lpg" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.213365 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-v78t5" event={"ID":"412ddd32-a861-4cec-8d5e-bb21069835e9","Type":"ContainerStarted","Data":"e48eea652c9463b21c4ba132f792dcf8433d2fb759148655e1f975d70a2ade1a"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.213632 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-v78t5" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.215221 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-x8k5g" event={"ID":"9225dfe1-877e-43a2-9034-0e355019aa04","Type":"ContainerStarted","Data":"a38dc92fbae28a16d94c65c964c092e69dd19f34e3849c3a514535cd1a0109cf"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.215403 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-x8k5g" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.216754 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh" event={"ID":"ae2caf34-b7b2-486c-9e9e-a48cf04eed87","Type":"ContainerStarted","Data":"0858192c5acedc319261e5e172e226e13a7df009ab5448e3cb9fe2aae84731fb"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.216952 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.218353 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz" event={"ID":"e843ba99-8859-41d7-9142-ed9227b4d8e1","Type":"ContainerStarted","Data":"f2e7e4427bfee2666075ff23e9e2cd191e9c2f44104c9ac57ea893c1380f3fc9"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.218767 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.220784 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bg8b9" event={"ID":"d9dca6f4-a577-44fa-a959-8398fb57dca0","Type":"ContainerStarted","Data":"64137bf803504895f14d6dc34ade30ba00b732d10af602782a73c0c59d1dee62"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.221203 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bg8b9" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.222361 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47" event={"ID":"3c3ef321-6a40-4d2e-a414-ad6a65cd32cf","Type":"ContainerStarted","Data":"a6387f5fc4fed1d67693ee6958ba34df3f717ff42c2e52d436cf9986e4ddd6b9"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.222746 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.224020 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kvrz2" 
event={"ID":"1954f819-78c2-46fd-a6bf-c626d50ef527","Type":"ContainerStarted","Data":"f035dd4f1a922a59da77b22787d260a4cbeccaa99848699efc1e10f6fd3e8cad"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.224455 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kvrz2" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.226183 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6h4tg" event={"ID":"c5e9a9c2-b7c8-4e1a-8c60-e6fcb0616e01","Type":"ContainerStarted","Data":"997ba4d63f24e5b47093dc7afbe614e3cf43df320e5c201d9779bfea1b7e8bc4"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.226684 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6h4tg" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.228519 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-4zf8v" event={"ID":"123b9f81-d315-44b3-a6ec-d777cc18ab7b","Type":"ContainerStarted","Data":"2755aedc753f646a51f59758ecde423f07ffaaa6a75e9c980b931868c4bdc87b"} Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.228919 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-4zf8v" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.645463 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-hfm25" podStartSLOduration=3.507825641 podStartE2EDuration="28.645425902s" podCreationTimestamp="2026-03-19 10:39:28 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.757989162 +0000 UTC m=+1068.106934704" lastFinishedPulling="2026-03-19 10:39:54.895589413 +0000 UTC m=+1093.244534965" 
observedRunningTime="2026-03-19 10:39:56.385139933 +0000 UTC m=+1094.734085475" watchObservedRunningTime="2026-03-19 10:39:56.645425902 +0000 UTC m=+1094.994371444" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.775003 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47" podStartSLOduration=3.7275894210000002 podStartE2EDuration="28.774973356s" podCreationTimestamp="2026-03-19 10:39:28 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.862213668 +0000 UTC m=+1068.211159210" lastFinishedPulling="2026-03-19 10:39:54.909597603 +0000 UTC m=+1093.258543145" observedRunningTime="2026-03-19 10:39:56.650952322 +0000 UTC m=+1094.999897874" watchObservedRunningTime="2026-03-19 10:39:56.774973356 +0000 UTC m=+1095.123918898" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.850714 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-v78t5" podStartSLOduration=4.679781845 podStartE2EDuration="29.85069773s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.754304142 +0000 UTC m=+1068.103249684" lastFinishedPulling="2026-03-19 10:39:54.925220027 +0000 UTC m=+1093.274165569" observedRunningTime="2026-03-19 10:39:56.850155725 +0000 UTC m=+1095.199101277" watchObservedRunningTime="2026-03-19 10:39:56.85069773 +0000 UTC m=+1095.199643272" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.853510 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6h4tg" podStartSLOduration=7.983148679 podStartE2EDuration="29.853503496s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.316265081 +0000 UTC m=+1067.665210623" lastFinishedPulling="2026-03-19 10:39:51.186619898 +0000 UTC m=+1089.535565440" 
observedRunningTime="2026-03-19 10:39:56.783790215 +0000 UTC m=+1095.132735757" watchObservedRunningTime="2026-03-19 10:39:56.853503496 +0000 UTC m=+1095.202449038" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.927759 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-ncc44" podStartSLOduration=4.331840988 podStartE2EDuration="29.927738519s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.324113904 +0000 UTC m=+1067.673059446" lastFinishedPulling="2026-03-19 10:39:54.920011435 +0000 UTC m=+1093.268956977" observedRunningTime="2026-03-19 10:39:56.926209748 +0000 UTC m=+1095.275155290" watchObservedRunningTime="2026-03-19 10:39:56.927738519 +0000 UTC m=+1095.276684061" Mar 19 10:39:56 crc kubenswrapper[4765]: I0319 10:39:56.966491 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kvrz2" podStartSLOduration=8.080744476 podStartE2EDuration="29.96646681s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.301980564 +0000 UTC m=+1067.650926106" lastFinishedPulling="2026-03-19 10:39:51.187702898 +0000 UTC m=+1089.536648440" observedRunningTime="2026-03-19 10:39:56.963472058 +0000 UTC m=+1095.312417600" watchObservedRunningTime="2026-03-19 10:39:56.96646681 +0000 UTC m=+1095.315412352" Mar 19 10:39:57 crc kubenswrapper[4765]: I0319 10:39:57.132253 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh" podStartSLOduration=4.088901591 podStartE2EDuration="29.132236596s" podCreationTimestamp="2026-03-19 10:39:28 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.85380176 +0000 UTC m=+1068.202747302" lastFinishedPulling="2026-03-19 10:39:54.897136765 +0000 UTC m=+1093.246082307" 
observedRunningTime="2026-03-19 10:39:57.127839966 +0000 UTC m=+1095.476785528" watchObservedRunningTime="2026-03-19 10:39:57.132236596 +0000 UTC m=+1095.481182138" Mar 19 10:39:57 crc kubenswrapper[4765]: I0319 10:39:57.133194 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-q9lpg" podStartSLOduration=8.57091968 podStartE2EDuration="30.133187581s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.624712037 +0000 UTC m=+1067.973657579" lastFinishedPulling="2026-03-19 10:39:51.186979938 +0000 UTC m=+1089.535925480" observedRunningTime="2026-03-19 10:39:57.085065146 +0000 UTC m=+1095.434010698" watchObservedRunningTime="2026-03-19 10:39:57.133187581 +0000 UTC m=+1095.482133123" Mar 19 10:39:57 crc kubenswrapper[4765]: I0319 10:39:57.160284 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-p27jn" podStartSLOduration=8.607672947 podStartE2EDuration="30.160249735s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.635652614 +0000 UTC m=+1067.984598156" lastFinishedPulling="2026-03-19 10:39:51.188229402 +0000 UTC m=+1089.537174944" observedRunningTime="2026-03-19 10:39:57.153859742 +0000 UTC m=+1095.502805284" watchObservedRunningTime="2026-03-19 10:39:57.160249735 +0000 UTC m=+1095.509195297" Mar 19 10:39:57 crc kubenswrapper[4765]: I0319 10:39:57.172353 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-qrqf8" podStartSLOduration=6.801808518 podStartE2EDuration="30.172329343s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.16322466 +0000 UTC m=+1067.512170202" lastFinishedPulling="2026-03-19 10:39:52.533745485 +0000 UTC m=+1090.882691027" 
observedRunningTime="2026-03-19 10:39:57.169338072 +0000 UTC m=+1095.518283614" watchObservedRunningTime="2026-03-19 10:39:57.172329343 +0000 UTC m=+1095.521274885" Mar 19 10:39:57 crc kubenswrapper[4765]: I0319 10:39:57.217031 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz" podStartSLOduration=4.215194747 podStartE2EDuration="29.217004705s" podCreationTimestamp="2026-03-19 10:39:28 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.873216307 +0000 UTC m=+1068.222161849" lastFinishedPulling="2026-03-19 10:39:54.875026255 +0000 UTC m=+1093.223971807" observedRunningTime="2026-03-19 10:39:57.211756392 +0000 UTC m=+1095.560701934" watchObservedRunningTime="2026-03-19 10:39:57.217004705 +0000 UTC m=+1095.565950247" Mar 19 10:39:57 crc kubenswrapper[4765]: I0319 10:39:57.246778 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-b4kfn" event={"ID":"473e9670-e72d-4e54-8b06-9d73666cbfc0","Type":"ContainerStarted","Data":"fd470f5e24627e89876f7cda0065a1c39f99deedbb3c7c824aa0bab3ae4d1c72"} Mar 19 10:39:57 crc kubenswrapper[4765]: I0319 10:39:57.247359 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-b4kfn" Mar 19 10:39:57 crc kubenswrapper[4765]: I0319 10:39:57.250640 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-qrqf8" Mar 19 10:39:57 crc kubenswrapper[4765]: I0319 10:39:57.287683 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-kzvs8" podStartSLOduration=7.38404061 podStartE2EDuration="30.287652031s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.629031154 +0000 UTC 
m=+1067.977976696" lastFinishedPulling="2026-03-19 10:39:52.532642575 +0000 UTC m=+1090.881588117" observedRunningTime="2026-03-19 10:39:57.258282284 +0000 UTC m=+1095.607227846" watchObservedRunningTime="2026-03-19 10:39:57.287652031 +0000 UTC m=+1095.636597573" Mar 19 10:39:57 crc kubenswrapper[4765]: I0319 10:39:57.314205 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bg8b9" podStartSLOduration=7.20265473 podStartE2EDuration="30.31418504s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.4227651 +0000 UTC m=+1067.771710642" lastFinishedPulling="2026-03-19 10:39:52.53429541 +0000 UTC m=+1090.883240952" observedRunningTime="2026-03-19 10:39:57.310584043 +0000 UTC m=+1095.659529605" watchObservedRunningTime="2026-03-19 10:39:57.31418504 +0000 UTC m=+1095.663130572" Mar 19 10:39:57 crc kubenswrapper[4765]: I0319 10:39:57.367290 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-78t28" podStartSLOduration=6.911814391 podStartE2EDuration="30.36726924s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.645454579 +0000 UTC m=+1067.994400121" lastFinishedPulling="2026-03-19 10:39:53.100909428 +0000 UTC m=+1091.449854970" observedRunningTime="2026-03-19 10:39:57.348521712 +0000 UTC m=+1095.697467264" watchObservedRunningTime="2026-03-19 10:39:57.36726924 +0000 UTC m=+1095.716214782" Mar 19 10:39:57 crc kubenswrapper[4765]: I0319 10:39:57.414385 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-4zf8v" podStartSLOduration=8.409148003 podStartE2EDuration="30.414369458s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.181392773 +0000 UTC m=+1067.530338315" 
lastFinishedPulling="2026-03-19 10:39:51.186614228 +0000 UTC m=+1089.535559770" observedRunningTime="2026-03-19 10:39:57.373126079 +0000 UTC m=+1095.722071631" watchObservedRunningTime="2026-03-19 10:39:57.414369458 +0000 UTC m=+1095.763315000" Mar 19 10:39:57 crc kubenswrapper[4765]: I0319 10:39:57.416620 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-x8k5g" podStartSLOduration=8.890878809 podStartE2EDuration="30.416614189s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.660874208 +0000 UTC m=+1068.009819750" lastFinishedPulling="2026-03-19 10:39:51.186609598 +0000 UTC m=+1089.535555130" observedRunningTime="2026-03-19 10:39:57.413193066 +0000 UTC m=+1095.762138608" watchObservedRunningTime="2026-03-19 10:39:57.416614189 +0000 UTC m=+1095.765559731" Mar 19 10:39:57 crc kubenswrapper[4765]: I0319 10:39:57.447686 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz" podStartSLOduration=4.337201595 podStartE2EDuration="29.44766329s" podCreationTimestamp="2026-03-19 10:39:28 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.750811017 +0000 UTC m=+1068.099756559" lastFinishedPulling="2026-03-19 10:39:54.861272702 +0000 UTC m=+1093.210218254" observedRunningTime="2026-03-19 10:39:57.442657424 +0000 UTC m=+1095.791602966" watchObservedRunningTime="2026-03-19 10:39:57.44766329 +0000 UTC m=+1095.796608832" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.101870 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert\") pod \"infra-operator-controller-manager-7b9c774f96-2gnht\" (UID: \"0cd862fe-c896-4fa6-a9ba-b1af6441f777\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht" Mar 19 
10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.108330 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0cd862fe-c896-4fa6-a9ba-b1af6441f777-cert\") pod \"infra-operator-controller-manager-7b9c774f96-2gnht\" (UID: \"0cd862fe-c896-4fa6-a9ba-b1af6441f777\") " pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.132083 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-b4kfn" podStartSLOduration=6.176203301 podStartE2EDuration="33.132054367s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.847230912 +0000 UTC m=+1068.196176454" lastFinishedPulling="2026-03-19 10:39:56.803081978 +0000 UTC m=+1095.152027520" observedRunningTime="2026-03-19 10:39:57.477221811 +0000 UTC m=+1095.826167353" watchObservedRunningTime="2026-03-19 10:40:00.132054367 +0000 UTC m=+1098.480999919" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.140045 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565280-nf5hg"] Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.142764 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565280-nf5hg" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.148042 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.148060 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.148793 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.151253 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565280-nf5hg"] Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.291677 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-nmxkk" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.299798 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.306104 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tvqc\" (UniqueName: \"kubernetes.io/projected/823c3d08-1e08-4fba-b6c7-8591036f93bc-kube-api-access-7tvqc\") pod \"auto-csr-approver-29565280-nf5hg\" (UID: \"823c3d08-1e08-4fba-b6c7-8591036f93bc\") " pod="openshift-infra/auto-csr-approver-29565280-nf5hg" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.407543 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tvqc\" (UniqueName: \"kubernetes.io/projected/823c3d08-1e08-4fba-b6c7-8591036f93bc-kube-api-access-7tvqc\") pod \"auto-csr-approver-29565280-nf5hg\" (UID: \"823c3d08-1e08-4fba-b6c7-8591036f93bc\") " pod="openshift-infra/auto-csr-approver-29565280-nf5hg" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.438759 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tvqc\" (UniqueName: \"kubernetes.io/projected/823c3d08-1e08-4fba-b6c7-8591036f93bc-kube-api-access-7tvqc\") pod \"auto-csr-approver-29565280-nf5hg\" (UID: \"823c3d08-1e08-4fba-b6c7-8591036f93bc\") " pod="openshift-infra/auto-csr-approver-29565280-nf5hg" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.474697 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565280-nf5hg" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.509194 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-vqghn\" (UID: \"981806c8-2390-44ac-a6f8-81c5f5bb0374\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.514636 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/981806c8-2390-44ac-a6f8-81c5f5bb0374-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-vqghn\" (UID: \"981806c8-2390-44ac-a6f8-81c5f5bb0374\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.566077 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-xtgsp" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.573260 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.713784 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.713882 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.718228 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-metrics-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.721095 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/970bc693-0463-4dfe-8870-fac695fffcae-webhook-certs\") pod \"openstack-operator-controller-manager-c9c9c96bc-4hzdg\" (UID: \"970bc693-0463-4dfe-8870-fac695fffcae\") " pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.792677 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht"] Mar 19 10:40:00 crc kubenswrapper[4765]: W0319 10:40:00.804900 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0cd862fe_c896_4fa6_a9ba_b1af6441f777.slice/crio-d4c75c4aad82f8f148fbd8523f52400e78dddd4072adea91bdeb63e7f61c2e0f WatchSource:0}: Error finding container d4c75c4aad82f8f148fbd8523f52400e78dddd4072adea91bdeb63e7f61c2e0f: Status 404 returned error can't find the container with id d4c75c4aad82f8f148fbd8523f52400e78dddd4072adea91bdeb63e7f61c2e0f Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.873687 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-8vsps" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.882104 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:40:00 crc kubenswrapper[4765]: I0319 10:40:00.899654 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565280-nf5hg"] Mar 19 10:40:00 crc kubenswrapper[4765]: W0319 10:40:00.903855 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod823c3d08_1e08_4fba_b6c7_8591036f93bc.slice/crio-0f79e06570ee26ff269422057fa7b82184e5e2b2c83b0d677eec1e386edb76c2 WatchSource:0}: Error finding container 0f79e06570ee26ff269422057fa7b82184e5e2b2c83b0d677eec1e386edb76c2: Status 404 returned error can't find the container with id 0f79e06570ee26ff269422057fa7b82184e5e2b2c83b0d677eec1e386edb76c2 Mar 19 10:40:01 crc kubenswrapper[4765]: I0319 10:40:01.005946 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn"] Mar 19 10:40:01 crc 
kubenswrapper[4765]: I0319 10:40:01.128540 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg"] Mar 19 10:40:01 crc kubenswrapper[4765]: I0319 10:40:01.278131 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" event={"ID":"981806c8-2390-44ac-a6f8-81c5f5bb0374","Type":"ContainerStarted","Data":"741dda5d7de90b2f3534b2bc354c28a7e3a045ebdc49e4f0695393b75c986d0f"} Mar 19 10:40:01 crc kubenswrapper[4765]: I0319 10:40:01.280256 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" event={"ID":"970bc693-0463-4dfe-8870-fac695fffcae","Type":"ContainerStarted","Data":"dbcfe2e2246f65e49bc579bbae6621055a61a70976ad5debc7572c7c13427dab"} Mar 19 10:40:01 crc kubenswrapper[4765]: I0319 10:40:01.281471 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565280-nf5hg" event={"ID":"823c3d08-1e08-4fba-b6c7-8591036f93bc","Type":"ContainerStarted","Data":"0f79e06570ee26ff269422057fa7b82184e5e2b2c83b0d677eec1e386edb76c2"} Mar 19 10:40:01 crc kubenswrapper[4765]: I0319 10:40:01.283221 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht" event={"ID":"0cd862fe-c896-4fa6-a9ba-b1af6441f777","Type":"ContainerStarted","Data":"d4c75c4aad82f8f148fbd8523f52400e78dddd4072adea91bdeb63e7f61c2e0f"} Mar 19 10:40:01 crc kubenswrapper[4765]: I0319 10:40:01.655686 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:40:01 crc kubenswrapper[4765]: I0319 10:40:01.656064 4765 prober.go:107] 
"Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:40:01 crc kubenswrapper[4765]: I0319 10:40:01.656110 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:40:01 crc kubenswrapper[4765]: I0319 10:40:01.656721 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"74ef4f9e7cb23afbd3cc2c57d6c7b62007d3fc20daf2ec79338ae2ff820f9dfb"} pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:40:01 crc kubenswrapper[4765]: I0319 10:40:01.656776 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://74ef4f9e7cb23afbd3cc2c57d6c7b62007d3fc20daf2ec79338ae2ff820f9dfb" gracePeriod=600 Mar 19 10:40:02 crc kubenswrapper[4765]: I0319 10:40:02.292861 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" event={"ID":"970bc693-0463-4dfe-8870-fac695fffcae","Type":"ContainerStarted","Data":"e07cdaba060dde8834686527722c08b7edf3d9d9367c9b64f86f81da454b0ad3"} Mar 19 10:40:02 crc kubenswrapper[4765]: I0319 10:40:02.294215 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:40:02 crc kubenswrapper[4765]: I0319 10:40:02.304842 4765 generic.go:334] "Generic 
(PLEG): container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerID="74ef4f9e7cb23afbd3cc2c57d6c7b62007d3fc20daf2ec79338ae2ff820f9dfb" exitCode=0 Mar 19 10:40:02 crc kubenswrapper[4765]: I0319 10:40:02.304915 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"74ef4f9e7cb23afbd3cc2c57d6c7b62007d3fc20daf2ec79338ae2ff820f9dfb"} Mar 19 10:40:02 crc kubenswrapper[4765]: I0319 10:40:02.304979 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"f15a48ff831f92a999a822adb51bf1e4ef1ab9b4cad221adcbd0787b32c65b85"} Mar 19 10:40:02 crc kubenswrapper[4765]: I0319 10:40:02.305004 4765 scope.go:117] "RemoveContainer" containerID="c126315de99fbe26aafdf378053a0eb9d09d2fb8e735089e0f39caceb743cb3e" Mar 19 10:40:02 crc kubenswrapper[4765]: I0319 10:40:02.334650 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" podStartSLOduration=34.334622994 podStartE2EDuration="34.334622994s" podCreationTimestamp="2026-03-19 10:39:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:40:02.324441248 +0000 UTC m=+1100.673386790" watchObservedRunningTime="2026-03-19 10:40:02.334622994 +0000 UTC m=+1100.683568536" Mar 19 10:40:04 crc kubenswrapper[4765]: I0319 10:40:04.326174 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wxsgc" event={"ID":"6647190b-c26b-4c57-bc84-7e5cfe6a5649","Type":"ContainerStarted","Data":"cfda68fc510338ee9f354fd646dd198239afc005cf341de000547fce3ce2cf61"} Mar 19 10:40:04 crc 
kubenswrapper[4765]: I0319 10:40:04.327161 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wxsgc" Mar 19 10:40:04 crc kubenswrapper[4765]: I0319 10:40:04.328152 4765 generic.go:334] "Generic (PLEG): container finished" podID="823c3d08-1e08-4fba-b6c7-8591036f93bc" containerID="1dbfa468f3f14bdc0b4bd4795787e07bcfc06b131f93eb6364d7b257f6bd081e" exitCode=0 Mar 19 10:40:04 crc kubenswrapper[4765]: I0319 10:40:04.328224 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565280-nf5hg" event={"ID":"823c3d08-1e08-4fba-b6c7-8591036f93bc","Type":"ContainerDied","Data":"1dbfa468f3f14bdc0b4bd4795787e07bcfc06b131f93eb6364d7b257f6bd081e"} Mar 19 10:40:04 crc kubenswrapper[4765]: I0319 10:40:04.330075 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht" event={"ID":"0cd862fe-c896-4fa6-a9ba-b1af6441f777","Type":"ContainerStarted","Data":"430281ff986539e6b9b323a1e746c926863147293640c79d8aa8aa4c52744e43"} Mar 19 10:40:04 crc kubenswrapper[4765]: I0319 10:40:04.330579 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht" Mar 19 10:40:04 crc kubenswrapper[4765]: I0319 10:40:04.333047 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" event={"ID":"981806c8-2390-44ac-a6f8-81c5f5bb0374","Type":"ContainerStarted","Data":"7bc52c3459611de41d90266f940bbbbc5147b7db07e733e63c3d5c37ddae887c"} Mar 19 10:40:04 crc kubenswrapper[4765]: I0319 10:40:04.333173 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:40:04 crc kubenswrapper[4765]: I0319 10:40:04.345076 4765 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wxsgc" podStartSLOduration=3.244762954 podStartE2EDuration="37.345059442s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:39:29.735488731 +0000 UTC m=+1068.084434273" lastFinishedPulling="2026-03-19 10:40:03.835785219 +0000 UTC m=+1102.184730761" observedRunningTime="2026-03-19 10:40:04.344007583 +0000 UTC m=+1102.692953135" watchObservedRunningTime="2026-03-19 10:40:04.345059442 +0000 UTC m=+1102.694004994" Mar 19 10:40:04 crc kubenswrapper[4765]: I0319 10:40:04.408446 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht" podStartSLOduration=34.929055444 podStartE2EDuration="37.40842531s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:40:00.807693211 +0000 UTC m=+1099.156638753" lastFinishedPulling="2026-03-19 10:40:03.287063077 +0000 UTC m=+1101.636008619" observedRunningTime="2026-03-19 10:40:04.375457026 +0000 UTC m=+1102.724402578" watchObservedRunningTime="2026-03-19 10:40:04.40842531 +0000 UTC m=+1102.757370852" Mar 19 10:40:04 crc kubenswrapper[4765]: I0319 10:40:04.410110 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" podStartSLOduration=35.148514687 podStartE2EDuration="37.410097096s" podCreationTimestamp="2026-03-19 10:39:27 +0000 UTC" firstStartedPulling="2026-03-19 10:40:01.025247841 +0000 UTC m=+1099.374193383" lastFinishedPulling="2026-03-19 10:40:03.28683025 +0000 UTC m=+1101.635775792" observedRunningTime="2026-03-19 10:40:04.407877965 +0000 UTC m=+1102.756823517" watchObservedRunningTime="2026-03-19 10:40:04.410097096 +0000 UTC m=+1102.759042638" Mar 19 10:40:05 crc kubenswrapper[4765]: I0319 10:40:05.633320 4765 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565280-nf5hg" Mar 19 10:40:05 crc kubenswrapper[4765]: I0319 10:40:05.803087 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tvqc\" (UniqueName: \"kubernetes.io/projected/823c3d08-1e08-4fba-b6c7-8591036f93bc-kube-api-access-7tvqc\") pod \"823c3d08-1e08-4fba-b6c7-8591036f93bc\" (UID: \"823c3d08-1e08-4fba-b6c7-8591036f93bc\") " Mar 19 10:40:05 crc kubenswrapper[4765]: I0319 10:40:05.810193 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/823c3d08-1e08-4fba-b6c7-8591036f93bc-kube-api-access-7tvqc" (OuterVolumeSpecName: "kube-api-access-7tvqc") pod "823c3d08-1e08-4fba-b6c7-8591036f93bc" (UID: "823c3d08-1e08-4fba-b6c7-8591036f93bc"). InnerVolumeSpecName "kube-api-access-7tvqc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:40:05 crc kubenswrapper[4765]: I0319 10:40:05.905467 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tvqc\" (UniqueName: \"kubernetes.io/projected/823c3d08-1e08-4fba-b6c7-8591036f93bc-kube-api-access-7tvqc\") on node \"crc\" DevicePath \"\"" Mar 19 10:40:06 crc kubenswrapper[4765]: I0319 10:40:06.348149 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565280-nf5hg" Mar 19 10:40:06 crc kubenswrapper[4765]: I0319 10:40:06.348188 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565280-nf5hg" event={"ID":"823c3d08-1e08-4fba-b6c7-8591036f93bc","Type":"ContainerDied","Data":"0f79e06570ee26ff269422057fa7b82184e5e2b2c83b0d677eec1e386edb76c2"} Mar 19 10:40:06 crc kubenswrapper[4765]: I0319 10:40:06.348647 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f79e06570ee26ff269422057fa7b82184e5e2b2c83b0d677eec1e386edb76c2" Mar 19 10:40:06 crc kubenswrapper[4765]: I0319 10:40:06.699990 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565274-dcpdf"] Mar 19 10:40:06 crc kubenswrapper[4765]: I0319 10:40:06.705119 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565274-dcpdf"] Mar 19 10:40:07 crc kubenswrapper[4765]: I0319 10:40:07.359076 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hsxfs" event={"ID":"408f748b-ca2b-4ae8-8994-63d7da422df9","Type":"ContainerStarted","Data":"28437dd377eed1cf1a48cced5718346f8e67dcf0e99ceb249fa25496498dd819"} Mar 19 10:40:07 crc kubenswrapper[4765]: I0319 10:40:07.381170 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-manager-668c99d594-hsxfs" podStartSLOduration=2.560202739 podStartE2EDuration="39.381150747s" podCreationTimestamp="2026-03-19 10:39:28 +0000 UTC" firstStartedPulling="2026-03-19 10:39:30.012973887 +0000 UTC m=+1068.361919429" lastFinishedPulling="2026-03-19 10:40:06.833921895 +0000 UTC m=+1105.182867437" observedRunningTime="2026-03-19 10:40:07.375042161 +0000 UTC m=+1105.723987713" watchObservedRunningTime="2026-03-19 10:40:07.381150747 +0000 UTC m=+1105.730096279" Mar 19 10:40:08 crc 
kubenswrapper[4765]: I0319 10:40:08.052736 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-kvrz2" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.083535 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-ncc44" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.108433 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-6h4tg" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.176952 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-4zf8v" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.221118 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-qrqf8" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.245254 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-kzvs8" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.263803 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-p27jn" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.366018 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ddd89057-e049-46d9-823f-f38e8297b7fd" path="/var/lib/kubelet/pods/ddd89057-e049-46d9-823f-f38e8297b7fd/volumes" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.531797 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-q9lpg" Mar 19 10:40:08 crc kubenswrapper[4765]: 
I0319 10:40:08.559922 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-bg8b9" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.581662 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-x8k5g" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.685813 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-wxsgc" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.691348 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-78t28" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.706914 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-v78t5" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.830621 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-2zbqz" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.847546 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-hfm25" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.952521 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-7wsnh" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.955887 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-dbv47" Mar 19 10:40:08 crc kubenswrapper[4765]: I0319 10:40:08.985996 4765 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-b4kfn" Mar 19 10:40:09 crc kubenswrapper[4765]: I0319 10:40:09.193255 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-xdlrz" Mar 19 10:40:10 crc kubenswrapper[4765]: I0319 10:40:10.306463 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-7b9c774f96-2gnht" Mar 19 10:40:10 crc kubenswrapper[4765]: I0319 10:40:10.580846 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-vqghn" Mar 19 10:40:10 crc kubenswrapper[4765]: I0319 10:40:10.888368 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-c9c9c96bc-4hzdg" Mar 19 10:40:29 crc kubenswrapper[4765]: I0319 10:40:29.849090 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bv86p"] Mar 19 10:40:29 crc kubenswrapper[4765]: E0319 10:40:29.850088 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="823c3d08-1e08-4fba-b6c7-8591036f93bc" containerName="oc" Mar 19 10:40:29 crc kubenswrapper[4765]: I0319 10:40:29.850107 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="823c3d08-1e08-4fba-b6c7-8591036f93bc" containerName="oc" Mar 19 10:40:29 crc kubenswrapper[4765]: I0319 10:40:29.850302 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="823c3d08-1e08-4fba-b6c7-8591036f93bc" containerName="oc" Mar 19 10:40:29 crc kubenswrapper[4765]: I0319 10:40:29.851241 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bv86p" Mar 19 10:40:29 crc kubenswrapper[4765]: I0319 10:40:29.854891 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 19 10:40:29 crc kubenswrapper[4765]: I0319 10:40:29.854921 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 19 10:40:29 crc kubenswrapper[4765]: I0319 10:40:29.855082 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 19 10:40:29 crc kubenswrapper[4765]: I0319 10:40:29.857102 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-lvctp" Mar 19 10:40:29 crc kubenswrapper[4765]: I0319 10:40:29.866462 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bv86p"] Mar 19 10:40:29 crc kubenswrapper[4765]: I0319 10:40:29.960266 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mh9dh"] Mar 19 10:40:29 crc kubenswrapper[4765]: I0319 10:40:29.961873 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" Mar 19 10:40:29 crc kubenswrapper[4765]: I0319 10:40:29.968892 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mh9dh"] Mar 19 10:40:29 crc kubenswrapper[4765]: I0319 10:40:29.969806 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpr4j\" (UniqueName: \"kubernetes.io/projected/921a9df0-bfdb-42f0-82f6-863f86806dda-kube-api-access-vpr4j\") pod \"dnsmasq-dns-675f4bcbfc-bv86p\" (UID: \"921a9df0-bfdb-42f0-82f6-863f86806dda\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bv86p" Mar 19 10:40:29 crc kubenswrapper[4765]: I0319 10:40:29.969986 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/921a9df0-bfdb-42f0-82f6-863f86806dda-config\") pod \"dnsmasq-dns-675f4bcbfc-bv86p\" (UID: \"921a9df0-bfdb-42f0-82f6-863f86806dda\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bv86p" Mar 19 10:40:29 crc kubenswrapper[4765]: I0319 10:40:29.970718 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.071107 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9shq\" (UniqueName: \"kubernetes.io/projected/a9774f11-fa76-40b5-8655-f1cd952f9f24-kube-api-access-v9shq\") pod \"dnsmasq-dns-78dd6ddcc-mh9dh\" (UID: \"a9774f11-fa76-40b5-8655-f1cd952f9f24\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.071180 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpr4j\" (UniqueName: \"kubernetes.io/projected/921a9df0-bfdb-42f0-82f6-863f86806dda-kube-api-access-vpr4j\") pod \"dnsmasq-dns-675f4bcbfc-bv86p\" (UID: \"921a9df0-bfdb-42f0-82f6-863f86806dda\") " 
pod="openstack/dnsmasq-dns-675f4bcbfc-bv86p" Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.071236 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9774f11-fa76-40b5-8655-f1cd952f9f24-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-mh9dh\" (UID: \"a9774f11-fa76-40b5-8655-f1cd952f9f24\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.071276 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9774f11-fa76-40b5-8655-f1cd952f9f24-config\") pod \"dnsmasq-dns-78dd6ddcc-mh9dh\" (UID: \"a9774f11-fa76-40b5-8655-f1cd952f9f24\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.071304 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/921a9df0-bfdb-42f0-82f6-863f86806dda-config\") pod \"dnsmasq-dns-675f4bcbfc-bv86p\" (UID: \"921a9df0-bfdb-42f0-82f6-863f86806dda\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bv86p" Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.072602 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/921a9df0-bfdb-42f0-82f6-863f86806dda-config\") pod \"dnsmasq-dns-675f4bcbfc-bv86p\" (UID: \"921a9df0-bfdb-42f0-82f6-863f86806dda\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bv86p" Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.092416 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpr4j\" (UniqueName: \"kubernetes.io/projected/921a9df0-bfdb-42f0-82f6-863f86806dda-kube-api-access-vpr4j\") pod \"dnsmasq-dns-675f4bcbfc-bv86p\" (UID: \"921a9df0-bfdb-42f0-82f6-863f86806dda\") " pod="openstack/dnsmasq-dns-675f4bcbfc-bv86p" Mar 19 10:40:30 crc 
kubenswrapper[4765]: I0319 10:40:30.169263 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bv86p" Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.172986 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9774f11-fa76-40b5-8655-f1cd952f9f24-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-mh9dh\" (UID: \"a9774f11-fa76-40b5-8655-f1cd952f9f24\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.173042 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9774f11-fa76-40b5-8655-f1cd952f9f24-config\") pod \"dnsmasq-dns-78dd6ddcc-mh9dh\" (UID: \"a9774f11-fa76-40b5-8655-f1cd952f9f24\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.173102 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9shq\" (UniqueName: \"kubernetes.io/projected/a9774f11-fa76-40b5-8655-f1cd952f9f24-kube-api-access-v9shq\") pod \"dnsmasq-dns-78dd6ddcc-mh9dh\" (UID: \"a9774f11-fa76-40b5-8655-f1cd952f9f24\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.174078 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9774f11-fa76-40b5-8655-f1cd952f9f24-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-mh9dh\" (UID: \"a9774f11-fa76-40b5-8655-f1cd952f9f24\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.174190 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9774f11-fa76-40b5-8655-f1cd952f9f24-config\") pod \"dnsmasq-dns-78dd6ddcc-mh9dh\" (UID: \"a9774f11-fa76-40b5-8655-f1cd952f9f24\") " 
pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.202755 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9shq\" (UniqueName: \"kubernetes.io/projected/a9774f11-fa76-40b5-8655-f1cd952f9f24-kube-api-access-v9shq\") pod \"dnsmasq-dns-78dd6ddcc-mh9dh\" (UID: \"a9774f11-fa76-40b5-8655-f1cd952f9f24\") " pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.284366 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.587575 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bv86p"] Mar 19 10:40:30 crc kubenswrapper[4765]: I0319 10:40:30.706543 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mh9dh"] Mar 19 10:40:30 crc kubenswrapper[4765]: W0319 10:40:30.709763 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda9774f11_fa76_40b5_8655_f1cd952f9f24.slice/crio-fe4d72db1a91f60a6edc865ed89a069a02db162bca4e9c131943e4d18d17e841 WatchSource:0}: Error finding container fe4d72db1a91f60a6edc865ed89a069a02db162bca4e9c131943e4d18d17e841: Status 404 returned error can't find the container with id fe4d72db1a91f60a6edc865ed89a069a02db162bca4e9c131943e4d18d17e841 Mar 19 10:40:31 crc kubenswrapper[4765]: I0319 10:40:31.545917 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" event={"ID":"a9774f11-fa76-40b5-8655-f1cd952f9f24","Type":"ContainerStarted","Data":"fe4d72db1a91f60a6edc865ed89a069a02db162bca4e9c131943e4d18d17e841"} Mar 19 10:40:31 crc kubenswrapper[4765]: I0319 10:40:31.547134 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bv86p" 
event={"ID":"921a9df0-bfdb-42f0-82f6-863f86806dda","Type":"ContainerStarted","Data":"a5c8c44d72373198c5da0e48d97bc37b32f51331976905a8f96dffed23830af5"} Mar 19 10:40:32 crc kubenswrapper[4765]: I0319 10:40:32.819135 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bv86p"] Mar 19 10:40:32 crc kubenswrapper[4765]: I0319 10:40:32.879081 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nq26n"] Mar 19 10:40:32 crc kubenswrapper[4765]: I0319 10:40:32.881011 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" Mar 19 10:40:32 crc kubenswrapper[4765]: I0319 10:40:32.894241 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nq26n"] Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.023253 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf28f1ab-e0a6-4481-bb82-4cd47321520a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nq26n\" (UID: \"cf28f1ab-e0a6-4481-bb82-4cd47321520a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.023308 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qg8pr\" (UniqueName: \"kubernetes.io/projected/cf28f1ab-e0a6-4481-bb82-4cd47321520a-kube-api-access-qg8pr\") pod \"dnsmasq-dns-5ccc8479f9-nq26n\" (UID: \"cf28f1ab-e0a6-4481-bb82-4cd47321520a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.023376 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf28f1ab-e0a6-4481-bb82-4cd47321520a-config\") pod \"dnsmasq-dns-5ccc8479f9-nq26n\" (UID: \"cf28f1ab-e0a6-4481-bb82-4cd47321520a\") " 
pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.120544 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mh9dh"] Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.125134 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf28f1ab-e0a6-4481-bb82-4cd47321520a-config\") pod \"dnsmasq-dns-5ccc8479f9-nq26n\" (UID: \"cf28f1ab-e0a6-4481-bb82-4cd47321520a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.125227 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf28f1ab-e0a6-4481-bb82-4cd47321520a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nq26n\" (UID: \"cf28f1ab-e0a6-4481-bb82-4cd47321520a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.125258 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qg8pr\" (UniqueName: \"kubernetes.io/projected/cf28f1ab-e0a6-4481-bb82-4cd47321520a-kube-api-access-qg8pr\") pod \"dnsmasq-dns-5ccc8479f9-nq26n\" (UID: \"cf28f1ab-e0a6-4481-bb82-4cd47321520a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.126750 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf28f1ab-e0a6-4481-bb82-4cd47321520a-config\") pod \"dnsmasq-dns-5ccc8479f9-nq26n\" (UID: \"cf28f1ab-e0a6-4481-bb82-4cd47321520a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.128517 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf28f1ab-e0a6-4481-bb82-4cd47321520a-dns-svc\") pod \"dnsmasq-dns-5ccc8479f9-nq26n\" 
(UID: \"cf28f1ab-e0a6-4481-bb82-4cd47321520a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.152393 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-n9ml8"] Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.155217 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.160226 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qg8pr\" (UniqueName: \"kubernetes.io/projected/cf28f1ab-e0a6-4481-bb82-4cd47321520a-kube-api-access-qg8pr\") pod \"dnsmasq-dns-5ccc8479f9-nq26n\" (UID: \"cf28f1ab-e0a6-4481-bb82-4cd47321520a\") " pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.169846 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-n9ml8"] Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.219450 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.330948 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ddad710-e7a8-4593-82db-bddeef4de69e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-n9ml8\" (UID: \"1ddad710-e7a8-4593-82db-bddeef4de69e\") " pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.331356 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ddad710-e7a8-4593-82db-bddeef4de69e-config\") pod \"dnsmasq-dns-57d769cc4f-n9ml8\" (UID: \"1ddad710-e7a8-4593-82db-bddeef4de69e\") " pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.331477 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4w5vq\" (UniqueName: \"kubernetes.io/projected/1ddad710-e7a8-4593-82db-bddeef4de69e-kube-api-access-4w5vq\") pod \"dnsmasq-dns-57d769cc4f-n9ml8\" (UID: \"1ddad710-e7a8-4593-82db-bddeef4de69e\") " pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.432938 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ddad710-e7a8-4593-82db-bddeef4de69e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-n9ml8\" (UID: \"1ddad710-e7a8-4593-82db-bddeef4de69e\") " pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.433011 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ddad710-e7a8-4593-82db-bddeef4de69e-config\") pod \"dnsmasq-dns-57d769cc4f-n9ml8\" (UID: \"1ddad710-e7a8-4593-82db-bddeef4de69e\") " 
pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.433068 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4w5vq\" (UniqueName: \"kubernetes.io/projected/1ddad710-e7a8-4593-82db-bddeef4de69e-kube-api-access-4w5vq\") pod \"dnsmasq-dns-57d769cc4f-n9ml8\" (UID: \"1ddad710-e7a8-4593-82db-bddeef4de69e\") " pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.434580 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ddad710-e7a8-4593-82db-bddeef4de69e-config\") pod \"dnsmasq-dns-57d769cc4f-n9ml8\" (UID: \"1ddad710-e7a8-4593-82db-bddeef4de69e\") " pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.435799 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ddad710-e7a8-4593-82db-bddeef4de69e-dns-svc\") pod \"dnsmasq-dns-57d769cc4f-n9ml8\" (UID: \"1ddad710-e7a8-4593-82db-bddeef4de69e\") " pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.467289 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4w5vq\" (UniqueName: \"kubernetes.io/projected/1ddad710-e7a8-4593-82db-bddeef4de69e-kube-api-access-4w5vq\") pod \"dnsmasq-dns-57d769cc4f-n9ml8\" (UID: \"1ddad710-e7a8-4593-82db-bddeef4de69e\") " pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.519013 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" Mar 19 10:40:33 crc kubenswrapper[4765]: I0319 10:40:33.764328 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nq26n"] Mar 19 10:40:33 crc kubenswrapper[4765]: W0319 10:40:33.787257 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcf28f1ab_e0a6_4481_bb82_4cd47321520a.slice/crio-aa249d54c0f68ff5a300b8504492d8418d90cd1cab85af2463da9e71578e9e1e WatchSource:0}: Error finding container aa249d54c0f68ff5a300b8504492d8418d90cd1cab85af2463da9e71578e9e1e: Status 404 returned error can't find the container with id aa249d54c0f68ff5a300b8504492d8418d90cd1cab85af2463da9e71578e9e1e Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.000279 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.001832 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.007210 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.007538 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-spcmw" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.007770 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.007976 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.008043 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.008134 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.008316 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.040167 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-n9ml8"] Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.061370 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.146479 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.146543 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.146585 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.146611 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.146642 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.146669 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcfsp\" (UniqueName: \"kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-kube-api-access-jcfsp\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.146694 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.146716 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ccdb0a31-8b87-4024-848f-efebcf46e604-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.146782 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.146827 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ccdb0a31-8b87-4024-848f-efebcf46e604-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.146912 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " 
pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.250754 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcfsp\" (UniqueName: \"kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-kube-api-access-jcfsp\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.250834 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.250879 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ccdb0a31-8b87-4024-848f-efebcf46e604-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.251088 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.251173 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ccdb0a31-8b87-4024-848f-efebcf46e604-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: 
I0319 10:40:34.251244 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.251351 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.251409 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.251461 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.251482 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.251533 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.252865 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.252899 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.253300 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.253586 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.253929 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" 
(UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.256308 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.258674 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ccdb0a31-8b87-4024-848f-efebcf46e604-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.259043 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.272496 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.278069 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ccdb0a31-8b87-4024-848f-efebcf46e604-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.287156 
4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.288388 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.288805 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.294223 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-k796b" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.294363 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.294393 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.294568 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.294659 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.294797 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.294855 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.310654 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcfsp\" (UniqueName: 
\"kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-kube-api-access-jcfsp\") pod \"rabbitmq-cell1-server-0\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.322184 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.340685 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.462811 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.462871 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2190a046-0d52-49c7-b2fd-aa113c2f3f99-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.462899 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.462925 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8np94\" (UniqueName: \"kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-kube-api-access-8np94\") pod 
\"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.463011 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.463054 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2190a046-0d52-49c7-b2fd-aa113c2f3f99-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.463091 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.463130 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-config-data\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.463164 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-server-conf\") pod \"rabbitmq-server-0\" (UID: 
\"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.463217 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.463247 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.564853 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.564902 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8np94\" (UniqueName: \"kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-kube-api-access-8np94\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.564944 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 
10:40:34.564993 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2190a046-0d52-49c7-b2fd-aa113c2f3f99-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.565028 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.565070 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-config-data\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.565107 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.565161 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.565190 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.565244 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.565269 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2190a046-0d52-49c7-b2fd-aa113c2f3f99-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.565410 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.565574 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.565680 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") device mount path \"/mnt/openstack/pv11\"" 
pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.566465 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-config-data\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.567586 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-server-conf\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.569191 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.569412 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.572507 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.573094 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: 
\"kubernetes.io/secret/2190a046-0d52-49c7-b2fd-aa113c2f3f99-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.574542 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2190a046-0d52-49c7-b2fd-aa113c2f3f99-pod-info\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.575859 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" event={"ID":"cf28f1ab-e0a6-4481-bb82-4cd47321520a","Type":"ContainerStarted","Data":"aa249d54c0f68ff5a300b8504492d8418d90cd1cab85af2463da9e71578e9e1e"} Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.584449 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8np94\" (UniqueName: \"kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-kube-api-access-8np94\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.595710 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " pod="openstack/rabbitmq-server-0" Mar 19 10:40:34 crc kubenswrapper[4765]: I0319 10:40:34.675035 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.310442 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-galera-0"] Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.312250 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.340553 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-scripts" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.341343 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config-data" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.341644 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-svc" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.345399 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-dockercfg-x2hxd" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.347182 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.351617 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"combined-ca-bundle" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.380481 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-config-data-default\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.380571 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-operator-scripts\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.380604 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.380639 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qvnj5\" (UniqueName: \"kubernetes.io/projected/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-kube-api-access-qvnj5\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.380658 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.380679 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-config-data-generated\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.380727 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: 
\"kubernetes.io/configmap/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-kolla-config\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.380976 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.482813 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qvnj5\" (UniqueName: \"kubernetes.io/projected/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-kube-api-access-qvnj5\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.482875 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.482904 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-config-data-generated\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.483007 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-kolla-config\") pod \"openstack-galera-0\" (UID: 
\"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.483046 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.483112 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-config-data-default\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.483199 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-operator-scripts\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.483228 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.483826 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") device mount path \"/mnt/openstack/pv01\"" pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.485031 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-config-data-default\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.485391 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-config-data-generated\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.485686 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-kolla-config\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.486291 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-operator-scripts\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.491940 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-galera-tls-certs\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.506654 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qvnj5\" (UniqueName: 
\"kubernetes.io/projected/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-kube-api-access-qvnj5\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.519687 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/adf887ce-99cf-47a0-89e8-2db5aa92a9ca-combined-ca-bundle\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.522205 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"openstack-galera-0\" (UID: \"adf887ce-99cf-47a0-89e8-2db5aa92a9ca\") " pod="openstack/openstack-galera-0" Mar 19 10:40:35 crc kubenswrapper[4765]: I0319 10:40:35.649866 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstack-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.770391 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.772361 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.775736 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"galera-openstack-cell1-dockercfg-xfngh" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.776476 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-scripts" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.776647 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-galera-openstack-cell1-svc" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.776793 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-cell1-config-data" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.794210 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.809857 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/85cd8112-bca8-45df-b61a-d2690fbbfb16-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.809915 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85cd8112-bca8-45df-b61a-d2690fbbfb16-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.809952 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25wrx\" (UniqueName: 
\"kubernetes.io/projected/85cd8112-bca8-45df-b61a-d2690fbbfb16-kube-api-access-25wrx\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.810050 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/85cd8112-bca8-45df-b61a-d2690fbbfb16-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.810084 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.810133 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/85cd8112-bca8-45df-b61a-d2690fbbfb16-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.810195 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/85cd8112-bca8-45df-b61a-d2690fbbfb16-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.810219 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/85cd8112-bca8-45df-b61a-d2690fbbfb16-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.911089 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/memcached-0"] Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.912497 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/memcached-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.913551 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25wrx\" (UniqueName: \"kubernetes.io/projected/85cd8112-bca8-45df-b61a-d2690fbbfb16-kube-api-access-25wrx\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.913656 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/85cd8112-bca8-45df-b61a-d2690fbbfb16-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.913698 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.913744 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/85cd8112-bca8-45df-b61a-d2690fbbfb16-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: 
\"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.913806 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/85cd8112-bca8-45df-b61a-d2690fbbfb16-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.913830 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/85cd8112-bca8-45df-b61a-d2690fbbfb16-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.913869 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/85cd8112-bca8-45df-b61a-d2690fbbfb16-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.913893 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85cd8112-bca8-45df-b61a-d2690fbbfb16-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.925443 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"memcached-memcached-dockercfg-f6fvp" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.925589 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"memcached-config-data" Mar 19 10:40:36 crc 
kubenswrapper[4765]: I0319 10:40:36.925760 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-memcached-svc" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.926716 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/85cd8112-bca8-45df-b61a-d2690fbbfb16-combined-ca-bundle\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.926875 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/85cd8112-bca8-45df-b61a-d2690fbbfb16-config-data-default\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.926988 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") device mount path \"/mnt/openstack/pv04\"" pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.927155 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/85cd8112-bca8-45df-b61a-d2690fbbfb16-config-data-generated\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.927699 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.928502 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/85cd8112-bca8-45df-b61a-d2690fbbfb16-operator-scripts\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.932103 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/85cd8112-bca8-45df-b61a-d2690fbbfb16-kolla-config\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.932174 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"galera-tls-certs\" (UniqueName: \"kubernetes.io/secret/85cd8112-bca8-45df-b61a-d2690fbbfb16-galera-tls-certs\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.951542 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25wrx\" (UniqueName: \"kubernetes.io/projected/85cd8112-bca8-45df-b61a-d2690fbbfb16-kube-api-access-25wrx\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:36 crc kubenswrapper[4765]: I0319 10:40:36.990199 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"openstack-cell1-galera-0\" (UID: \"85cd8112-bca8-45df-b61a-d2690fbbfb16\") " pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.015508 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zscpq\" (UniqueName: \"kubernetes.io/projected/81d90cd2-d47a-47c5-aeff-20f377ed9159-kube-api-access-zscpq\") pod 
\"memcached-0\" (UID: \"81d90cd2-d47a-47c5-aeff-20f377ed9159\") " pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.015615 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/81d90cd2-d47a-47c5-aeff-20f377ed9159-memcached-tls-certs\") pod \"memcached-0\" (UID: \"81d90cd2-d47a-47c5-aeff-20f377ed9159\") " pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.015710 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d90cd2-d47a-47c5-aeff-20f377ed9159-combined-ca-bundle\") pod \"memcached-0\" (UID: \"81d90cd2-d47a-47c5-aeff-20f377ed9159\") " pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.015751 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/81d90cd2-d47a-47c5-aeff-20f377ed9159-kolla-config\") pod \"memcached-0\" (UID: \"81d90cd2-d47a-47c5-aeff-20f377ed9159\") " pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.015785 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81d90cd2-d47a-47c5-aeff-20f377ed9159-config-data\") pod \"memcached-0\" (UID: \"81d90cd2-d47a-47c5-aeff-20f377ed9159\") " pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.091482 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/openstack-cell1-galera-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.117714 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d90cd2-d47a-47c5-aeff-20f377ed9159-combined-ca-bundle\") pod \"memcached-0\" (UID: \"81d90cd2-d47a-47c5-aeff-20f377ed9159\") " pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.117812 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/81d90cd2-d47a-47c5-aeff-20f377ed9159-kolla-config\") pod \"memcached-0\" (UID: \"81d90cd2-d47a-47c5-aeff-20f377ed9159\") " pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.117843 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81d90cd2-d47a-47c5-aeff-20f377ed9159-config-data\") pod \"memcached-0\" (UID: \"81d90cd2-d47a-47c5-aeff-20f377ed9159\") " pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.117875 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zscpq\" (UniqueName: \"kubernetes.io/projected/81d90cd2-d47a-47c5-aeff-20f377ed9159-kube-api-access-zscpq\") pod \"memcached-0\" (UID: \"81d90cd2-d47a-47c5-aeff-20f377ed9159\") " pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.117918 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/81d90cd2-d47a-47c5-aeff-20f377ed9159-memcached-tls-certs\") pod \"memcached-0\" (UID: \"81d90cd2-d47a-47c5-aeff-20f377ed9159\") " pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.121551 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kolla-config\" (UniqueName: \"kubernetes.io/configmap/81d90cd2-d47a-47c5-aeff-20f377ed9159-kolla-config\") pod \"memcached-0\" (UID: \"81d90cd2-d47a-47c5-aeff-20f377ed9159\") " pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.123869 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/81d90cd2-d47a-47c5-aeff-20f377ed9159-combined-ca-bundle\") pod \"memcached-0\" (UID: \"81d90cd2-d47a-47c5-aeff-20f377ed9159\") " pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.124279 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memcached-tls-certs\" (UniqueName: \"kubernetes.io/secret/81d90cd2-d47a-47c5-aeff-20f377ed9159-memcached-tls-certs\") pod \"memcached-0\" (UID: \"81d90cd2-d47a-47c5-aeff-20f377ed9159\") " pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.126392 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/81d90cd2-d47a-47c5-aeff-20f377ed9159-config-data\") pod \"memcached-0\" (UID: \"81d90cd2-d47a-47c5-aeff-20f377ed9159\") " pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.138118 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zscpq\" (UniqueName: \"kubernetes.io/projected/81d90cd2-d47a-47c5-aeff-20f377ed9159-kube-api-access-zscpq\") pod \"memcached-0\" (UID: \"81d90cd2-d47a-47c5-aeff-20f377ed9159\") " pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.273099 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/memcached-0" Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.480073 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 10:40:37 crc kubenswrapper[4765]: I0319 10:40:37.621507 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" event={"ID":"1ddad710-e7a8-4593-82db-bddeef4de69e","Type":"ContainerStarted","Data":"eb190c05ec32f135adfebd77dcad2034638933aa7e5f7e0f65a85950d61a7c47"} Mar 19 10:40:38 crc kubenswrapper[4765]: W0319 10:40:38.151188 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podccdb0a31_8b87_4024_848f_efebcf46e604.slice/crio-2c27c17a27d18f09dca325f0e2571f97508fb5de9b46463c1871c2b66c4e5bcc WatchSource:0}: Error finding container 2c27c17a27d18f09dca325f0e2571f97508fb5de9b46463c1871c2b66c4e5bcc: Status 404 returned error can't find the container with id 2c27c17a27d18f09dca325f0e2571f97508fb5de9b46463c1871c2b66c4e5bcc Mar 19 10:40:38 crc kubenswrapper[4765]: I0319 10:40:38.509065 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-galera-0"] Mar 19 10:40:38 crc kubenswrapper[4765]: I0319 10:40:38.639001 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 10:40:38 crc kubenswrapper[4765]: I0319 10:40:38.649511 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ccdb0a31-8b87-4024-848f-efebcf46e604","Type":"ContainerStarted","Data":"2c27c17a27d18f09dca325f0e2571f97508fb5de9b46463c1871c2b66c4e5bcc"} Mar 19 10:40:39 crc kubenswrapper[4765]: I0319 10:40:39.361419 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 10:40:39 crc kubenswrapper[4765]: I0319 10:40:39.364678 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 10:40:39 crc kubenswrapper[4765]: I0319 10:40:39.369372 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"telemetry-ceilometer-dockercfg-l5bwq" Mar 19 10:40:39 crc kubenswrapper[4765]: I0319 10:40:39.371470 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 10:40:39 crc kubenswrapper[4765]: I0319 10:40:39.501171 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bf6dm\" (UniqueName: \"kubernetes.io/projected/2aac07ca-a4d1-4730-ad33-00f6c3d0e418-kube-api-access-bf6dm\") pod \"kube-state-metrics-0\" (UID: \"2aac07ca-a4d1-4730-ad33-00f6c3d0e418\") " pod="openstack/kube-state-metrics-0" Mar 19 10:40:39 crc kubenswrapper[4765]: I0319 10:40:39.602880 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bf6dm\" (UniqueName: \"kubernetes.io/projected/2aac07ca-a4d1-4730-ad33-00f6c3d0e418-kube-api-access-bf6dm\") pod \"kube-state-metrics-0\" (UID: \"2aac07ca-a4d1-4730-ad33-00f6c3d0e418\") " pod="openstack/kube-state-metrics-0" Mar 19 10:40:39 crc kubenswrapper[4765]: I0319 10:40:39.626055 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bf6dm\" (UniqueName: \"kubernetes.io/projected/2aac07ca-a4d1-4730-ad33-00f6c3d0e418-kube-api-access-bf6dm\") pod \"kube-state-metrics-0\" (UID: \"2aac07ca-a4d1-4730-ad33-00f6c3d0e418\") " pod="openstack/kube-state-metrics-0" Mar 19 10:40:39 crc kubenswrapper[4765]: I0319 10:40:39.716282 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.430248 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ct9xj"] Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.432411 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.435635 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovncontroller-ovndbs" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.439253 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-scripts" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.439763 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncontroller-ovncontroller-dockercfg-8zs8f" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.465156 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ovs-bmbgn"] Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.467212 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.482753 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ct9xj"] Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.519677 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bmbgn"] Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.560118 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5272132e-561c-46b9-92c8-1714e40b3303-combined-ca-bundle\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.560206 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9p2w\" (UniqueName: \"kubernetes.io/projected/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-kube-api-access-j9p2w\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.560245 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-etc-ovs\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.560268 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tswn\" (UniqueName: \"kubernetes.io/projected/5272132e-561c-46b9-92c8-1714e40b3303-kube-api-access-5tswn\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 
10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.560303 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-var-run\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.560332 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5272132e-561c-46b9-92c8-1714e40b3303-var-run-ovn\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.560351 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-var-log\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.560369 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5272132e-561c-46b9-92c8-1714e40b3303-ovn-controller-tls-certs\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.560390 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5272132e-561c-46b9-92c8-1714e40b3303-var-log-ovn\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 
10:40:42.560432 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-scripts\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.560456 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5272132e-561c-46b9-92c8-1714e40b3303-scripts\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.560474 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5272132e-561c-46b9-92c8-1714e40b3303-var-run\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.560496 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-var-lib\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.661874 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-etc-ovs\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.661932 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-5tswn\" (UniqueName: \"kubernetes.io/projected/5272132e-561c-46b9-92c8-1714e40b3303-kube-api-access-5tswn\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.661993 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-var-run\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.662018 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5272132e-561c-46b9-92c8-1714e40b3303-var-run-ovn\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.662038 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-var-log\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.662055 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5272132e-561c-46b9-92c8-1714e40b3303-ovn-controller-tls-certs\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.662078 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5272132e-561c-46b9-92c8-1714e40b3303-var-log-ovn\") pod 
\"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.662120 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-scripts\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.662139 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5272132e-561c-46b9-92c8-1714e40b3303-scripts\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.662156 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5272132e-561c-46b9-92c8-1714e40b3303-var-run\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.662179 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-var-lib\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.662211 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5272132e-561c-46b9-92c8-1714e40b3303-combined-ca-bundle\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 
10:40:42.662251 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9p2w\" (UniqueName: \"kubernetes.io/projected/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-kube-api-access-j9p2w\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.662719 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/5272132e-561c-46b9-92c8-1714e40b3303-var-run-ovn\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.662773 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-var-log\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.662842 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/5272132e-561c-46b9-92c8-1714e40b3303-var-run\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.663038 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib\" (UniqueName: \"kubernetes.io/host-path/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-var-lib\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.663156 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: 
\"kubernetes.io/host-path/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-var-run\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.663251 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/5272132e-561c-46b9-92c8-1714e40b3303-var-log-ovn\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.665392 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5272132e-561c-46b9-92c8-1714e40b3303-scripts\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.671791 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-scripts\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.672087 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ovs\" (UniqueName: \"kubernetes.io/host-path/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-etc-ovs\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.675998 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5272132e-561c-46b9-92c8-1714e40b3303-combined-ca-bundle\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 
19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.676635 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-controller-tls-certs\" (UniqueName: \"kubernetes.io/secret/5272132e-561c-46b9-92c8-1714e40b3303-ovn-controller-tls-certs\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.686781 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tswn\" (UniqueName: \"kubernetes.io/projected/5272132e-561c-46b9-92c8-1714e40b3303-kube-api-access-5tswn\") pod \"ovn-controller-ct9xj\" (UID: \"5272132e-561c-46b9-92c8-1714e40b3303\") " pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.717611 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9p2w\" (UniqueName: \"kubernetes.io/projected/40f94856-44b1-42f4-9aa4-9b46f3fe13f3-kube-api-access-j9p2w\") pod \"ovn-controller-ovs-bmbgn\" (UID: \"40f94856-44b1-42f4-9aa4-9b46f3fe13f3\") " pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.724360 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2190a046-0d52-49c7-b2fd-aa113c2f3f99","Type":"ContainerStarted","Data":"df30f2ef76337c66293524a48dc1dc49221fdfb46718508dbd906c418420cf0d"} Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.728258 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"adf887ce-99cf-47a0-89e8-2db5aa92a9ca","Type":"ContainerStarted","Data":"9f335f43c57888644b5a8af33674dd519745892b60bfd5c66368512efbb41063"} Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.777009 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ct9xj" Mar 19 10:40:42 crc kubenswrapper[4765]: I0319 10:40:42.824571 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:40:43 crc kubenswrapper[4765]: I0319 10:40:43.928693 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 10:40:43 crc kubenswrapper[4765]: I0319 10:40:43.932099 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:43 crc kubenswrapper[4765]: I0319 10:40:43.937446 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovn-metrics" Mar 19 10:40:43 crc kubenswrapper[4765]: I0319 10:40:43.937614 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-sb-dockercfg-lv6td" Mar 19 10:40:43 crc kubenswrapper[4765]: I0319 10:40:43.938603 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-config" Mar 19 10:40:43 crc kubenswrapper[4765]: I0319 10:40:43.938918 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-sb-ovndbs" Mar 19 10:40:43 crc kubenswrapper[4765]: I0319 10:40:43.939089 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-sb-scripts" Mar 19 10:40:43 crc kubenswrapper[4765]: I0319 10:40:43.953950 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 10:40:43 crc kubenswrapper[4765]: I0319 10:40:43.984666 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:43 crc 
kubenswrapper[4765]: I0319 10:40:43.984708 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:43 crc kubenswrapper[4765]: I0319 10:40:43.984852 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:43 crc kubenswrapper[4765]: I0319 10:40:43.984981 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:43 crc kubenswrapper[4765]: I0319 10:40:43.985022 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:43 crc kubenswrapper[4765]: I0319 10:40:43.985098 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6gn9\" (UniqueName: \"kubernetes.io/projected/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-kube-api-access-g6gn9\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:43 crc kubenswrapper[4765]: I0319 10:40:43.985126 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:43 crc kubenswrapper[4765]: I0319 10:40:43.985148 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-config\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.091186 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.091254 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.091372 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.091429 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.091455 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.091528 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6gn9\" (UniqueName: \"kubernetes.io/projected/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-kube-api-access-g6gn9\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.091545 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-config\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.091563 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.092112 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-ovsdb-rundir\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc 
kubenswrapper[4765]: I0319 10:40:44.092202 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") device mount path \"/mnt/openstack/pv06\"" pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.092838 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-config\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.093165 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-scripts\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.097885 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-combined-ca-bundle\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.098013 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb-tls-certs\" (UniqueName: \"kubernetes.io/secret/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-ovsdbserver-sb-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.098619 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-metrics-certs-tls-certs\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.113991 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6gn9\" (UniqueName: \"kubernetes.io/projected/203ad8ad-1b9e-4191-99a0-7bfd9c193de8-kube-api-access-g6gn9\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.122303 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"ovsdbserver-sb-0\" (UID: \"203ad8ad-1b9e-4191-99a0-7bfd9c193de8\") " pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:44 crc kubenswrapper[4765]: I0319 10:40:44.267998 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-sb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.127207 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.128669 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.132553 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovndbcluster-nb-ovndbs" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.132651 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-scripts" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.132787 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovndbcluster-nb-config" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.133057 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovncluster-ovndbcluster-nb-dockercfg-djprj" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.137808 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.237516 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/17555a74-a31f-4d09-8b23-b8c774024c58-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.237621 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.237663 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/17555a74-a31f-4d09-8b23-b8c774024c58-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: 
\"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.237693 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17555a74-a31f-4d09-8b23-b8c774024c58-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.237717 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17555a74-a31f-4d09-8b23-b8c774024c58-config\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.237794 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2gsgg\" (UniqueName: \"kubernetes.io/projected/17555a74-a31f-4d09-8b23-b8c774024c58-kube-api-access-2gsgg\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.237853 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17555a74-a31f-4d09-8b23-b8c774024c58-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.237881 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17555a74-a31f-4d09-8b23-b8c774024c58-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " 
pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.338805 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/17555a74-a31f-4d09-8b23-b8c774024c58-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.338853 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17555a74-a31f-4d09-8b23-b8c774024c58-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.338880 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17555a74-a31f-4d09-8b23-b8c774024c58-config\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.338917 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2gsgg\" (UniqueName: \"kubernetes.io/projected/17555a74-a31f-4d09-8b23-b8c774024c58-kube-api-access-2gsgg\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.338989 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/17555a74-a31f-4d09-8b23-b8c774024c58-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.339014 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17555a74-a31f-4d09-8b23-b8c774024c58-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.339048 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/17555a74-a31f-4d09-8b23-b8c774024c58-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.339089 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.339336 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") device mount path \"/mnt/openstack/pv02\"" pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.342318 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/17555a74-a31f-4d09-8b23-b8c774024c58-config\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.342528 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdb-rundir\" (UniqueName: \"kubernetes.io/empty-dir/17555a74-a31f-4d09-8b23-b8c774024c58-ovsdb-rundir\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " 
pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.355989 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb-tls-certs\" (UniqueName: \"kubernetes.io/secret/17555a74-a31f-4d09-8b23-b8c774024c58-ovsdbserver-nb-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.356140 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/17555a74-a31f-4d09-8b23-b8c774024c58-combined-ca-bundle\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.356363 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/17555a74-a31f-4d09-8b23-b8c774024c58-metrics-certs-tls-certs\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.360513 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.370229 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2gsgg\" (UniqueName: \"kubernetes.io/projected/17555a74-a31f-4d09-8b23-b8c774024c58-kube-api-access-2gsgg\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.382385 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/17555a74-a31f-4d09-8b23-b8c774024c58-scripts\") pod \"ovsdbserver-nb-0\" (UID: \"17555a74-a31f-4d09-8b23-b8c774024c58\") " pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:46 crc kubenswrapper[4765]: I0319 10:40:46.456357 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovsdbserver-nb-0" Mar 19 10:40:47 crc kubenswrapper[4765]: I0319 10:40:47.774090 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstack-cell1-galera-0"] Mar 19 10:40:48 crc kubenswrapper[4765]: E0319 10:40:48.564529 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 10:40:48 crc kubenswrapper[4765]: E0319 10:40:48.564735 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-v9shq,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-mh9dh_openstack(a9774f11-fa76-40b5-8655-f1cd952f9f24): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 10:40:48 crc kubenswrapper[4765]: E0319 10:40:48.565978 4765 pod_workers.go:1301] "Error 
syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" podUID="a9774f11-fa76-40b5-8655-f1cd952f9f24" Mar 19 10:40:48 crc kubenswrapper[4765]: E0319 10:40:48.569942 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 19 10:40:48 crc kubenswrapper[4765]: E0319 10:40:48.570186 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-vpr4j,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-bv86p_openstack(921a9df0-bfdb-42f0-82f6-863f86806dda): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 10:40:48 crc kubenswrapper[4765]: E0319 10:40:48.571307 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack/dnsmasq-dns-675f4bcbfc-bv86p" podUID="921a9df0-bfdb-42f0-82f6-863f86806dda" Mar 19 10:40:49 crc kubenswrapper[4765]: W0319 10:40:49.994790 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod85cd8112_bca8_45df_b61a_d2690fbbfb16.slice/crio-c7c0c92b6891f4931cd0066ae60ea776d76a7cdcae557de202e6695992245cad WatchSource:0}: Error finding container c7c0c92b6891f4931cd0066ae60ea776d76a7cdcae557de202e6695992245cad: Status 404 returned error can't find the container with id c7c0c92b6891f4931cd0066ae60ea776d76a7cdcae557de202e6695992245cad Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.082090 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bv86p" Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.170672 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.251251 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/921a9df0-bfdb-42f0-82f6-863f86806dda-config\") pod \"921a9df0-bfdb-42f0-82f6-863f86806dda\" (UID: \"921a9df0-bfdb-42f0-82f6-863f86806dda\") " Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.251442 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpr4j\" (UniqueName: \"kubernetes.io/projected/921a9df0-bfdb-42f0-82f6-863f86806dda-kube-api-access-vpr4j\") pod \"921a9df0-bfdb-42f0-82f6-863f86806dda\" (UID: \"921a9df0-bfdb-42f0-82f6-863f86806dda\") " Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.254275 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/921a9df0-bfdb-42f0-82f6-863f86806dda-config" (OuterVolumeSpecName: "config") pod 
"921a9df0-bfdb-42f0-82f6-863f86806dda" (UID: "921a9df0-bfdb-42f0-82f6-863f86806dda"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.262703 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/921a9df0-bfdb-42f0-82f6-863f86806dda-kube-api-access-vpr4j" (OuterVolumeSpecName: "kube-api-access-vpr4j") pod "921a9df0-bfdb-42f0-82f6-863f86806dda" (UID: "921a9df0-bfdb-42f0-82f6-863f86806dda"). InnerVolumeSpecName "kube-api-access-vpr4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.353197 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9774f11-fa76-40b5-8655-f1cd952f9f24-config\") pod \"a9774f11-fa76-40b5-8655-f1cd952f9f24\" (UID: \"a9774f11-fa76-40b5-8655-f1cd952f9f24\") " Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.353252 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9774f11-fa76-40b5-8655-f1cd952f9f24-dns-svc\") pod \"a9774f11-fa76-40b5-8655-f1cd952f9f24\" (UID: \"a9774f11-fa76-40b5-8655-f1cd952f9f24\") " Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.353418 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9shq\" (UniqueName: \"kubernetes.io/projected/a9774f11-fa76-40b5-8655-f1cd952f9f24-kube-api-access-v9shq\") pod \"a9774f11-fa76-40b5-8655-f1cd952f9f24\" (UID: \"a9774f11-fa76-40b5-8655-f1cd952f9f24\") " Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.353807 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpr4j\" (UniqueName: \"kubernetes.io/projected/921a9df0-bfdb-42f0-82f6-863f86806dda-kube-api-access-vpr4j\") on node \"crc\" DevicePath \"\"" Mar 19 10:40:50 crc 
kubenswrapper[4765]: I0319 10:40:50.353849 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/921a9df0-bfdb-42f0-82f6-863f86806dda-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.354519 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9774f11-fa76-40b5-8655-f1cd952f9f24-config" (OuterVolumeSpecName: "config") pod "a9774f11-fa76-40b5-8655-f1cd952f9f24" (UID: "a9774f11-fa76-40b5-8655-f1cd952f9f24"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.355034 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a9774f11-fa76-40b5-8655-f1cd952f9f24-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a9774f11-fa76-40b5-8655-f1cd952f9f24" (UID: "a9774f11-fa76-40b5-8655-f1cd952f9f24"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.357581 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9774f11-fa76-40b5-8655-f1cd952f9f24-kube-api-access-v9shq" (OuterVolumeSpecName: "kube-api-access-v9shq") pod "a9774f11-fa76-40b5-8655-f1cd952f9f24" (UID: "a9774f11-fa76-40b5-8655-f1cd952f9f24"). InnerVolumeSpecName "kube-api-access-v9shq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.455367 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9shq\" (UniqueName: \"kubernetes.io/projected/a9774f11-fa76-40b5-8655-f1cd952f9f24-kube-api-access-v9shq\") on node \"crc\" DevicePath \"\"" Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.455410 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a9774f11-fa76-40b5-8655-f1cd952f9f24-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.455421 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a9774f11-fa76-40b5-8655-f1cd952f9f24-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.595772 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.677574 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/memcached-0"] Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.728544 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ovs-bmbgn"] Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.803204 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" event={"ID":"a9774f11-fa76-40b5-8655-f1cd952f9f24","Type":"ContainerDied","Data":"fe4d72db1a91f60a6edc865ed89a069a02db162bca4e9c131943e4d18d17e841"} Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.803320 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-mh9dh" Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.806092 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-bv86p" event={"ID":"921a9df0-bfdb-42f0-82f6-863f86806dda","Type":"ContainerDied","Data":"a5c8c44d72373198c5da0e48d97bc37b32f51331976905a8f96dffed23830af5"} Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.806177 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-bv86p" Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.809105 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"85cd8112-bca8-45df-b61a-d2690fbbfb16","Type":"ContainerStarted","Data":"c7c0c92b6891f4931cd0066ae60ea776d76a7cdcae557de202e6695992245cad"} Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.860540 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bv86p"] Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.874454 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-bv86p"] Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.897534 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mh9dh"] Mar 19 10:40:50 crc kubenswrapper[4765]: I0319 10:40:50.904726 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-mh9dh"] Mar 19 10:40:51 crc kubenswrapper[4765]: I0319 10:40:51.881047 4765 scope.go:117] "RemoveContainer" containerID="b60c0e4a6754265a9c7b14fd3467c8216f5d00dac9f3e0e7fa7e292fe00bf27f" Mar 19 10:40:52 crc kubenswrapper[4765]: I0319 10:40:52.366048 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="921a9df0-bfdb-42f0-82f6-863f86806dda" path="/var/lib/kubelet/pods/921a9df0-bfdb-42f0-82f6-863f86806dda/volumes" Mar 19 10:40:52 crc 
kubenswrapper[4765]: I0319 10:40:52.366495 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9774f11-fa76-40b5-8655-f1cd952f9f24" path="/var/lib/kubelet/pods/a9774f11-fa76-40b5-8655-f1cd952f9f24/volumes" Mar 19 10:40:55 crc kubenswrapper[4765]: W0319 10:40:55.098240 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40f94856_44b1_42f4_9aa4_9b46f3fe13f3.slice/crio-8e3968a1dd4700a05f3913cfb499482dc7b6ff8c9458319d16e8eb08480c19e3 WatchSource:0}: Error finding container 8e3968a1dd4700a05f3913cfb499482dc7b6ff8c9458319d16e8eb08480c19e3: Status 404 returned error can't find the container with id 8e3968a1dd4700a05f3913cfb499482dc7b6ff8c9458319d16e8eb08480c19e3 Mar 19 10:40:55 crc kubenswrapper[4765]: I0319 10:40:55.511230 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ct9xj"] Mar 19 10:40:55 crc kubenswrapper[4765]: W0319 10:40:55.621253 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5272132e_561c_46b9_92c8_1714e40b3303.slice/crio-5c4b18c9587bf25d8b8968a4d47db3da002fad946be1776dd7ce1c03952659b4 WatchSource:0}: Error finding container 5c4b18c9587bf25d8b8968a4d47db3da002fad946be1776dd7ce1c03952659b4: Status 404 returned error can't find the container with id 5c4b18c9587bf25d8b8968a4d47db3da002fad946be1776dd7ce1c03952659b4 Mar 19 10:40:55 crc kubenswrapper[4765]: I0319 10:40:55.716331 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-sb-0"] Mar 19 10:40:55 crc kubenswrapper[4765]: I0319 10:40:55.885845 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"85cd8112-bca8-45df-b61a-d2690fbbfb16","Type":"ContainerStarted","Data":"f2cff6bd713dfc208e286925a259743d527bcf980bae75c781f0ddaa8d2f6408"} Mar 19 10:40:55 crc kubenswrapper[4765]: I0319 10:40:55.887844 4765 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"adf887ce-99cf-47a0-89e8-2db5aa92a9ca","Type":"ContainerStarted","Data":"4c8304a899a8998b8cbbc3c3a80901cbcceb5111d0e33db9fb3807e04e3bf207"} Mar 19 10:40:55 crc kubenswrapper[4765]: I0319 10:40:55.890846 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bmbgn" event={"ID":"40f94856-44b1-42f4-9aa4-9b46f3fe13f3","Type":"ContainerStarted","Data":"8e3968a1dd4700a05f3913cfb499482dc7b6ff8c9458319d16e8eb08480c19e3"} Mar 19 10:40:55 crc kubenswrapper[4765]: I0319 10:40:55.894649 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ct9xj" event={"ID":"5272132e-561c-46b9-92c8-1714e40b3303","Type":"ContainerStarted","Data":"5c4b18c9587bf25d8b8968a4d47db3da002fad946be1776dd7ce1c03952659b4"} Mar 19 10:40:55 crc kubenswrapper[4765]: I0319 10:40:55.897104 4765 generic.go:334] "Generic (PLEG): container finished" podID="1ddad710-e7a8-4593-82db-bddeef4de69e" containerID="99a58ef94b3ce25dde495aaf9b57e544ef78cb3150b346c560b7e96695da1821" exitCode=0 Mar 19 10:40:55 crc kubenswrapper[4765]: I0319 10:40:55.897169 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" event={"ID":"1ddad710-e7a8-4593-82db-bddeef4de69e","Type":"ContainerDied","Data":"99a58ef94b3ce25dde495aaf9b57e544ef78cb3150b346c560b7e96695da1821"} Mar 19 10:40:55 crc kubenswrapper[4765]: I0319 10:40:55.902443 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"81d90cd2-d47a-47c5-aeff-20f377ed9159","Type":"ContainerStarted","Data":"3437baec107301314149c790e0e307a081e885845725a9d3d23c1fecbd184ef1"} Mar 19 10:40:55 crc kubenswrapper[4765]: I0319 10:40:55.906004 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" 
event={"ID":"203ad8ad-1b9e-4191-99a0-7bfd9c193de8","Type":"ContainerStarted","Data":"31f9ac6aa5e0e9f2d84ed0e09839532a7a1df0e1cb2b8337805f1ff8bef54a3a"} Mar 19 10:40:55 crc kubenswrapper[4765]: I0319 10:40:55.914933 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2aac07ca-a4d1-4730-ad33-00f6c3d0e418","Type":"ContainerStarted","Data":"3223c31b0e270fc0aed39b662ac1ef5a200ba871886471426060c50c14abbeff"} Mar 19 10:40:55 crc kubenswrapper[4765]: I0319 10:40:55.922092 4765 generic.go:334] "Generic (PLEG): container finished" podID="cf28f1ab-e0a6-4481-bb82-4cd47321520a" containerID="78d94cb29f97abba53b5187bbe469bc78608c941973177a633b4f7b638b353c8" exitCode=0 Mar 19 10:40:55 crc kubenswrapper[4765]: I0319 10:40:55.922162 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" event={"ID":"cf28f1ab-e0a6-4481-bb82-4cd47321520a","Type":"ContainerDied","Data":"78d94cb29f97abba53b5187bbe469bc78608c941973177a633b4f7b638b353c8"} Mar 19 10:40:56 crc kubenswrapper[4765]: I0319 10:40:56.572917 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovsdbserver-nb-0"] Mar 19 10:40:56 crc kubenswrapper[4765]: I0319 10:40:56.934808 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2190a046-0d52-49c7-b2fd-aa113c2f3f99","Type":"ContainerStarted","Data":"211030e1e9b78c18cebc64fcaced7fde324341ce901ce54b5a805cbbc4f3db12"} Mar 19 10:40:56 crc kubenswrapper[4765]: I0319 10:40:56.939217 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ccdb0a31-8b87-4024-848f-efebcf46e604","Type":"ContainerStarted","Data":"ede7c513464d77c5408931eb19bc0afaf30f0f27c91691fc636041f35a22d68b"} Mar 19 10:40:56 crc kubenswrapper[4765]: I0319 10:40:56.941152 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" 
event={"ID":"17555a74-a31f-4d09-8b23-b8c774024c58","Type":"ContainerStarted","Data":"2006054549ca5ce65ef22d56f72f18170744ca0eaa3956cbcf1cbdb846e6ef22"} Mar 19 10:40:59 crc kubenswrapper[4765]: I0319 10:40:59.970545 4765 generic.go:334] "Generic (PLEG): container finished" podID="adf887ce-99cf-47a0-89e8-2db5aa92a9ca" containerID="4c8304a899a8998b8cbbc3c3a80901cbcceb5111d0e33db9fb3807e04e3bf207" exitCode=0 Mar 19 10:40:59 crc kubenswrapper[4765]: I0319 10:40:59.970655 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"adf887ce-99cf-47a0-89e8-2db5aa92a9ca","Type":"ContainerDied","Data":"4c8304a899a8998b8cbbc3c3a80901cbcceb5111d0e33db9fb3807e04e3bf207"} Mar 19 10:40:59 crc kubenswrapper[4765]: I0319 10:40:59.974256 4765 generic.go:334] "Generic (PLEG): container finished" podID="85cd8112-bca8-45df-b61a-d2690fbbfb16" containerID="f2cff6bd713dfc208e286925a259743d527bcf980bae75c781f0ddaa8d2f6408" exitCode=0 Mar 19 10:40:59 crc kubenswrapper[4765]: I0319 10:40:59.974299 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"85cd8112-bca8-45df-b61a-d2690fbbfb16","Type":"ContainerDied","Data":"f2cff6bd713dfc208e286925a259743d527bcf980bae75c781f0ddaa8d2f6408"} Mar 19 10:41:00 crc kubenswrapper[4765]: I0319 10:41:00.992135 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-cell1-galera-0" event={"ID":"85cd8112-bca8-45df-b61a-d2690fbbfb16","Type":"ContainerStarted","Data":"5e60dca2c59a1c9176d0aee418fb6d680d0dc42753facb257a617098cc38d8e7"} Mar 19 10:41:00 crc kubenswrapper[4765]: I0319 10:41:00.995985 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstack-galera-0" event={"ID":"adf887ce-99cf-47a0-89e8-2db5aa92a9ca","Type":"ContainerStarted","Data":"6d31d29f76ed2fdf9b85c78667da3a771068902405d4a747383d5e44a60140ea"} Mar 19 10:41:01 crc kubenswrapper[4765]: I0319 10:41:01.001263 4765 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" event={"ID":"cf28f1ab-e0a6-4481-bb82-4cd47321520a","Type":"ContainerStarted","Data":"ac17fbc2f13baa25f279df75a69dd2bc1c669cc8be0926e3356cd36013cd3c4b"} Mar 19 10:41:01 crc kubenswrapper[4765]: I0319 10:41:01.001596 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" Mar 19 10:41:01 crc kubenswrapper[4765]: I0319 10:41:01.003182 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" event={"ID":"1ddad710-e7a8-4593-82db-bddeef4de69e","Type":"ContainerStarted","Data":"c12291d97997acb9a21d3c96eb756312edf666fedc21d3b338c7f4bdeaa3d361"} Mar 19 10:41:01 crc kubenswrapper[4765]: I0319 10:41:01.003679 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" Mar 19 10:41:01 crc kubenswrapper[4765]: I0319 10:41:01.042860 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" podStartSLOduration=9.607041544 podStartE2EDuration="28.0428402s" podCreationTimestamp="2026-03-19 10:40:33 +0000 UTC" firstStartedPulling="2026-03-19 10:40:36.843039803 +0000 UTC m=+1135.191985345" lastFinishedPulling="2026-03-19 10:40:55.278838459 +0000 UTC m=+1153.627784001" observedRunningTime="2026-03-19 10:41:01.037794294 +0000 UTC m=+1159.386739836" watchObservedRunningTime="2026-03-19 10:41:01.0428402 +0000 UTC m=+1159.391785742" Mar 19 10:41:01 crc kubenswrapper[4765]: I0319 10:41:01.043998 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-cell1-galera-0" podStartSLOduration=20.78913547 podStartE2EDuration="26.043993032s" podCreationTimestamp="2026-03-19 10:40:35 +0000 UTC" firstStartedPulling="2026-03-19 10:40:50.025360964 +0000 UTC m=+1148.374306506" lastFinishedPulling="2026-03-19 10:40:55.280218526 +0000 UTC m=+1153.629164068" observedRunningTime="2026-03-19 
10:41:01.022045896 +0000 UTC m=+1159.370991448" watchObservedRunningTime="2026-03-19 10:41:01.043993032 +0000 UTC m=+1159.392938574" Mar 19 10:41:01 crc kubenswrapper[4765]: I0319 10:41:01.070693 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstack-galera-0" podStartSLOduration=13.493897836 podStartE2EDuration="27.070665605s" podCreationTimestamp="2026-03-19 10:40:34 +0000 UTC" firstStartedPulling="2026-03-19 10:40:41.704945778 +0000 UTC m=+1140.053891320" lastFinishedPulling="2026-03-19 10:40:55.281713547 +0000 UTC m=+1153.630659089" observedRunningTime="2026-03-19 10:41:01.058990998 +0000 UTC m=+1159.407936540" watchObservedRunningTime="2026-03-19 10:41:01.070665605 +0000 UTC m=+1159.419611147" Mar 19 10:41:02 crc kubenswrapper[4765]: I0319 10:41:02.012979 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ct9xj" event={"ID":"5272132e-561c-46b9-92c8-1714e40b3303","Type":"ContainerStarted","Data":"5c900c78c6f628c56339b958c8b259557eb265d41ed4dd637d513d380e87871a"} Mar 19 10:41:02 crc kubenswrapper[4765]: I0319 10:41:02.014409 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ct9xj" Mar 19 10:41:02 crc kubenswrapper[4765]: I0319 10:41:02.014537 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"17555a74-a31f-4d09-8b23-b8c774024c58","Type":"ContainerStarted","Data":"5abb1b04fd8d43fafc3cf222ec0ed4f72d1c69b0574ddf71e195bc81d213ea7c"} Mar 19 10:41:02 crc kubenswrapper[4765]: I0319 10:41:02.016183 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/memcached-0" event={"ID":"81d90cd2-d47a-47c5-aeff-20f377ed9159","Type":"ContainerStarted","Data":"ec2b1a2adf685b6fdd6cdeeaa09c0a2e03afcbe42fed9a7aff639c845fdfc934"} Mar 19 10:41:02 crc kubenswrapper[4765]: I0319 10:41:02.016365 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/memcached-0" Mar 19 
10:41:02 crc kubenswrapper[4765]: I0319 10:41:02.018088 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"203ad8ad-1b9e-4191-99a0-7bfd9c193de8","Type":"ContainerStarted","Data":"4204c693dc29f562b8c3e860c0377b11d1ed26121f1631ea27ef2aecbfa9f734"} Mar 19 10:41:02 crc kubenswrapper[4765]: I0319 10:41:02.020929 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2aac07ca-a4d1-4730-ad33-00f6c3d0e418","Type":"ContainerStarted","Data":"b8da5bb7acfe81498f9794593fc593277e21c3a90e12efe6e045cbdfe591333a"} Mar 19 10:41:02 crc kubenswrapper[4765]: I0319 10:41:02.021073 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 19 10:41:02 crc kubenswrapper[4765]: I0319 10:41:02.022257 4765 generic.go:334] "Generic (PLEG): container finished" podID="40f94856-44b1-42f4-9aa4-9b46f3fe13f3" containerID="4e38b34bd7987cac91e856e596d09613aa2b8ff29ebceb135454b76241ad82ca" exitCode=0 Mar 19 10:41:02 crc kubenswrapper[4765]: I0319 10:41:02.022614 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bmbgn" event={"ID":"40f94856-44b1-42f4-9aa4-9b46f3fe13f3","Type":"ContainerDied","Data":"4e38b34bd7987cac91e856e596d09613aa2b8ff29ebceb135454b76241ad82ca"} Mar 19 10:41:02 crc kubenswrapper[4765]: I0319 10:41:02.040818 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ct9xj" podStartSLOduration=15.02211904 podStartE2EDuration="20.040793366s" podCreationTimestamp="2026-03-19 10:40:42 +0000 UTC" firstStartedPulling="2026-03-19 10:40:55.63392433 +0000 UTC m=+1153.982869872" lastFinishedPulling="2026-03-19 10:41:00.652598656 +0000 UTC m=+1159.001544198" observedRunningTime="2026-03-19 10:41:02.038209336 +0000 UTC m=+1160.387154878" watchObservedRunningTime="2026-03-19 10:41:02.040793366 +0000 UTC m=+1160.389738918" Mar 19 10:41:02 crc kubenswrapper[4765]: 
I0319 10:41:02.041109 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" podStartSLOduration=8.614270635 podStartE2EDuration="30.041104384s" podCreationTimestamp="2026-03-19 10:40:32 +0000 UTC" firstStartedPulling="2026-03-19 10:40:33.790366718 +0000 UTC m=+1132.139312260" lastFinishedPulling="2026-03-19 10:40:55.217200467 +0000 UTC m=+1153.566146009" observedRunningTime="2026-03-19 10:41:01.084306555 +0000 UTC m=+1159.433252097" watchObservedRunningTime="2026-03-19 10:41:02.041104384 +0000 UTC m=+1160.390049916" Mar 19 10:41:02 crc kubenswrapper[4765]: I0319 10:41:02.098147 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/memcached-0" podStartSLOduration=21.566802162 podStartE2EDuration="26.098125481s" podCreationTimestamp="2026-03-19 10:40:36 +0000 UTC" firstStartedPulling="2026-03-19 10:40:55.096497353 +0000 UTC m=+1153.445442895" lastFinishedPulling="2026-03-19 10:40:59.627820672 +0000 UTC m=+1157.976766214" observedRunningTime="2026-03-19 10:41:02.087834372 +0000 UTC m=+1160.436779924" watchObservedRunningTime="2026-03-19 10:41:02.098125481 +0000 UTC m=+1160.447071013" Mar 19 10:41:02 crc kubenswrapper[4765]: I0319 10:41:02.117078 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=17.450727801 podStartE2EDuration="23.117052714s" podCreationTimestamp="2026-03-19 10:40:39 +0000 UTC" firstStartedPulling="2026-03-19 10:40:55.095498836 +0000 UTC m=+1153.444444378" lastFinishedPulling="2026-03-19 10:41:00.761823749 +0000 UTC m=+1159.110769291" observedRunningTime="2026-03-19 10:41:02.100247839 +0000 UTC m=+1160.449193391" watchObservedRunningTime="2026-03-19 10:41:02.117052714 +0000 UTC m=+1160.465998256" Mar 19 10:41:03 crc kubenswrapper[4765]: I0319 10:41:03.034904 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bmbgn" 
event={"ID":"40f94856-44b1-42f4-9aa4-9b46f3fe13f3","Type":"ContainerStarted","Data":"083c2e0c31eb535bf30d42037dd082eb1ce61654b107c057c9d0bbea197d35fa"} Mar 19 10:41:03 crc kubenswrapper[4765]: I0319 10:41:03.035246 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ovs-bmbgn" event={"ID":"40f94856-44b1-42f4-9aa4-9b46f3fe13f3","Type":"ContainerStarted","Data":"35596c590d32efc086f8b4663497bb6c064527c7b92e1dbf322a3a2f022621fb"} Mar 19 10:41:03 crc kubenswrapper[4765]: I0319 10:41:03.064308 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-ovs-bmbgn" podStartSLOduration=16.105108103 podStartE2EDuration="21.064291986s" podCreationTimestamp="2026-03-19 10:40:42 +0000 UTC" firstStartedPulling="2026-03-19 10:40:55.105293472 +0000 UTC m=+1153.454239024" lastFinishedPulling="2026-03-19 10:41:00.064477365 +0000 UTC m=+1158.413422907" observedRunningTime="2026-03-19 10:41:03.06075236 +0000 UTC m=+1161.409697902" watchObservedRunningTime="2026-03-19 10:41:03.064291986 +0000 UTC m=+1161.413237528" Mar 19 10:41:04 crc kubenswrapper[4765]: I0319 10:41:04.045003 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:41:04 crc kubenswrapper[4765]: I0319 10:41:04.045394 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:41:05 crc kubenswrapper[4765]: I0319 10:41:05.650292 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-galera-0" Mar 19 10:41:05 crc kubenswrapper[4765]: I0319 10:41:05.651269 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-galera-0" Mar 19 10:41:05 crc kubenswrapper[4765]: I0319 10:41:05.769905 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-galera-0" Mar 19 10:41:06 crc kubenswrapper[4765]: I0319 
10:41:06.061362 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-sb-0" event={"ID":"203ad8ad-1b9e-4191-99a0-7bfd9c193de8","Type":"ContainerStarted","Data":"070f7c75f55ffafc13b19d633d4c607200c808da2ccecfa0d0cd20110fb4f037"} Mar 19 10:41:06 crc kubenswrapper[4765]: I0319 10:41:06.064366 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovsdbserver-nb-0" event={"ID":"17555a74-a31f-4d09-8b23-b8c774024c58","Type":"ContainerStarted","Data":"29a909f663b9f10ec635244bd86182366411ee51e254e3f2881db673f2137a33"} Mar 19 10:41:06 crc kubenswrapper[4765]: I0319 10:41:06.089855 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-sb-0" podStartSLOduration=14.390003006 podStartE2EDuration="24.089828944s" podCreationTimestamp="2026-03-19 10:40:42 +0000 UTC" firstStartedPulling="2026-03-19 10:40:55.745889546 +0000 UTC m=+1154.094835088" lastFinishedPulling="2026-03-19 10:41:05.445715484 +0000 UTC m=+1163.794661026" observedRunningTime="2026-03-19 10:41:06.083501583 +0000 UTC m=+1164.432447135" watchObservedRunningTime="2026-03-19 10:41:06.089828944 +0000 UTC m=+1164.438774496" Mar 19 10:41:06 crc kubenswrapper[4765]: I0319 10:41:06.114491 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovsdbserver-nb-0" podStartSLOduration=12.308671812 podStartE2EDuration="21.114466452s" podCreationTimestamp="2026-03-19 10:40:45 +0000 UTC" firstStartedPulling="2026-03-19 10:40:56.675428587 +0000 UTC m=+1155.024374129" lastFinishedPulling="2026-03-19 10:41:05.481223227 +0000 UTC m=+1163.830168769" observedRunningTime="2026-03-19 10:41:06.100941806 +0000 UTC m=+1164.449887358" watchObservedRunningTime="2026-03-19 10:41:06.114466452 +0000 UTC m=+1164.463411994" Mar 19 10:41:06 crc kubenswrapper[4765]: I0319 10:41:06.151105 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-galera-0" Mar 19 10:41:06 crc 
kubenswrapper[4765]: I0319 10:41:06.457250 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-nb-0" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.092561 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/openstack-cell1-galera-0" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.092913 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/openstack-cell1-galera-0" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.178921 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/openstack-cell1-galera-0" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.275225 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/memcached-0" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.457248 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-nb-0" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.496524 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-nb-0" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.565378 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-create-gn9lj"] Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.566764 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gn9lj" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.584262 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-505a-account-create-update-gp6pz"] Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.585435 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-505a-account-create-update-gp6pz" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.587709 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-db-secret" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.597574 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gn9lj"] Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.632744 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-505a-account-create-update-gp6pz"] Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.660251 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52s6l\" (UniqueName: \"kubernetes.io/projected/4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c-kube-api-access-52s6l\") pod \"glance-db-create-gn9lj\" (UID: \"4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c\") " pod="openstack/glance-db-create-gn9lj" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.660309 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/777c1144-ff82-4ac0-a887-e5859bccf142-operator-scripts\") pod \"glance-505a-account-create-update-gp6pz\" (UID: \"777c1144-ff82-4ac0-a887-e5859bccf142\") " pod="openstack/glance-505a-account-create-update-gp6pz" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.660579 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c-operator-scripts\") pod \"glance-db-create-gn9lj\" (UID: \"4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c\") " pod="openstack/glance-db-create-gn9lj" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.660695 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-2tf8k\" (UniqueName: \"kubernetes.io/projected/777c1144-ff82-4ac0-a887-e5859bccf142-kube-api-access-2tf8k\") pod \"glance-505a-account-create-update-gp6pz\" (UID: \"777c1144-ff82-4ac0-a887-e5859bccf142\") " pod="openstack/glance-505a-account-create-update-gp6pz" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.762870 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-52s6l\" (UniqueName: \"kubernetes.io/projected/4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c-kube-api-access-52s6l\") pod \"glance-db-create-gn9lj\" (UID: \"4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c\") " pod="openstack/glance-db-create-gn9lj" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.762933 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/777c1144-ff82-4ac0-a887-e5859bccf142-operator-scripts\") pod \"glance-505a-account-create-update-gp6pz\" (UID: \"777c1144-ff82-4ac0-a887-e5859bccf142\") " pod="openstack/glance-505a-account-create-update-gp6pz" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.763021 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c-operator-scripts\") pod \"glance-db-create-gn9lj\" (UID: \"4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c\") " pod="openstack/glance-db-create-gn9lj" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.763060 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2tf8k\" (UniqueName: \"kubernetes.io/projected/777c1144-ff82-4ac0-a887-e5859bccf142-kube-api-access-2tf8k\") pod \"glance-505a-account-create-update-gp6pz\" (UID: \"777c1144-ff82-4ac0-a887-e5859bccf142\") " pod="openstack/glance-505a-account-create-update-gp6pz" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.764068 4765 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/777c1144-ff82-4ac0-a887-e5859bccf142-operator-scripts\") pod \"glance-505a-account-create-update-gp6pz\" (UID: \"777c1144-ff82-4ac0-a887-e5859bccf142\") " pod="openstack/glance-505a-account-create-update-gp6pz" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.764214 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c-operator-scripts\") pod \"glance-db-create-gn9lj\" (UID: \"4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c\") " pod="openstack/glance-db-create-gn9lj" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.789578 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2tf8k\" (UniqueName: \"kubernetes.io/projected/777c1144-ff82-4ac0-a887-e5859bccf142-kube-api-access-2tf8k\") pod \"glance-505a-account-create-update-gp6pz\" (UID: \"777c1144-ff82-4ac0-a887-e5859bccf142\") " pod="openstack/glance-505a-account-create-update-gp6pz" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.789611 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-52s6l\" (UniqueName: \"kubernetes.io/projected/4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c-kube-api-access-52s6l\") pod \"glance-db-create-gn9lj\" (UID: \"4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c\") " pod="openstack/glance-db-create-gn9lj" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.887105 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gn9lj" Mar 19 10:41:07 crc kubenswrapper[4765]: I0319 10:41:07.912776 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-505a-account-create-update-gp6pz" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.154411 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-nb-0" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.226147 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.228340 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/openstack-cell1-galera-0" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.270941 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/ovsdbserver-sb-0" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.343563 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-create-25h9b"] Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.344676 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-25h9b" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.354176 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-25h9b"] Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.387033 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/ovsdbserver-sb-0" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.457044 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-create-gn9lj"] Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.465383 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-3f4d-account-create-update-qmrd4"] Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.466777 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-3f4d-account-create-update-qmrd4" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.475574 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w4rdx\" (UniqueName: \"kubernetes.io/projected/cb59bf52-1cd4-4d26-9c5d-9ee4561267c5-kube-api-access-w4rdx\") pod \"keystone-db-create-25h9b\" (UID: \"cb59bf52-1cd4-4d26-9c5d-9ee4561267c5\") " pod="openstack/keystone-db-create-25h9b" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.475703 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb59bf52-1cd4-4d26-9c5d-9ee4561267c5-operator-scripts\") pod \"keystone-db-create-25h9b\" (UID: \"cb59bf52-1cd4-4d26-9c5d-9ee4561267c5\") " pod="openstack/keystone-db-create-25h9b" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.476387 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-db-secret" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.487690 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3f4d-account-create-update-qmrd4"] Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.500469 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-n9ml8"] Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.500855 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" podUID="1ddad710-e7a8-4593-82db-bddeef4de69e" containerName="dnsmasq-dns" containerID="cri-o://c12291d97997acb9a21d3c96eb756312edf666fedc21d3b338c7f4bdeaa3d361" gracePeriod=10 Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.523738 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 
10:41:08.552010 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-r4cpf"] Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.554563 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.570043 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-505a-account-create-update-gp6pz"] Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.571486 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-nb" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.571052 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-r4cpf"] Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.578489 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc-operator-scripts\") pod \"keystone-3f4d-account-create-update-qmrd4\" (UID: \"df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc\") " pod="openstack/keystone-3f4d-account-create-update-qmrd4" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.578624 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4px4\" (UniqueName: \"kubernetes.io/projected/df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc-kube-api-access-p4px4\") pod \"keystone-3f4d-account-create-update-qmrd4\" (UID: \"df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc\") " pod="openstack/keystone-3f4d-account-create-update-qmrd4" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.578728 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w4rdx\" (UniqueName: \"kubernetes.io/projected/cb59bf52-1cd4-4d26-9c5d-9ee4561267c5-kube-api-access-w4rdx\") pod \"keystone-db-create-25h9b\" (UID: 
\"cb59bf52-1cd4-4d26-9c5d-9ee4561267c5\") " pod="openstack/keystone-db-create-25h9b" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.578804 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb59bf52-1cd4-4d26-9c5d-9ee4561267c5-operator-scripts\") pod \"keystone-db-create-25h9b\" (UID: \"cb59bf52-1cd4-4d26-9c5d-9ee4561267c5\") " pod="openstack/keystone-db-create-25h9b" Mar 19 10:41:08 crc kubenswrapper[4765]: W0319 10:41:08.579511 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod777c1144_ff82_4ac0_a887_e5859bccf142.slice/crio-735f60d593073012b9cd4e09d42716aa080b8b51ad666afa2054fa50bfd397d6 WatchSource:0}: Error finding container 735f60d593073012b9cd4e09d42716aa080b8b51ad666afa2054fa50bfd397d6: Status 404 returned error can't find the container with id 735f60d593073012b9cd4e09d42716aa080b8b51ad666afa2054fa50bfd397d6 Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.583885 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb59bf52-1cd4-4d26-9c5d-9ee4561267c5-operator-scripts\") pod \"keystone-db-create-25h9b\" (UID: \"cb59bf52-1cd4-4d26-9c5d-9ee4561267c5\") " pod="openstack/keystone-db-create-25h9b" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.633854 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-metrics-sq8vg"] Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.635190 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.642952 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w4rdx\" (UniqueName: \"kubernetes.io/projected/cb59bf52-1cd4-4d26-9c5d-9ee4561267c5-kube-api-access-w4rdx\") pod \"keystone-db-create-25h9b\" (UID: \"cb59bf52-1cd4-4d26-9c5d-9ee4561267c5\") " pod="openstack/keystone-db-create-25h9b" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.663511 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-metrics-config" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.688083 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-sq8vg"] Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.688445 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-25h9b" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.690878 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-r4cpf\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.691119 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc-operator-scripts\") pod \"keystone-3f4d-account-create-update-qmrd4\" (UID: \"df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc\") " pod="openstack/keystone-3f4d-account-create-update-qmrd4" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.691241 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsskj\" (UniqueName: 
\"kubernetes.io/projected/61c50130-b1a8-4d94-99e1-02c523a426a7-kube-api-access-hsskj\") pod \"dnsmasq-dns-5bf47b49b7-r4cpf\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.691368 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-r4cpf\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.691494 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-config\") pod \"dnsmasq-dns-5bf47b49b7-r4cpf\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.691599 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p4px4\" (UniqueName: \"kubernetes.io/projected/df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc-kube-api-access-p4px4\") pod \"keystone-3f4d-account-create-update-qmrd4\" (UID: \"df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc\") " pod="openstack/keystone-3f4d-account-create-update-qmrd4" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.702243 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc-operator-scripts\") pod \"keystone-3f4d-account-create-update-qmrd4\" (UID: \"df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc\") " pod="openstack/keystone-3f4d-account-create-update-qmrd4" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.745471 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-p4px4\" (UniqueName: \"kubernetes.io/projected/df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc-kube-api-access-p4px4\") pod \"keystone-3f4d-account-create-update-qmrd4\" (UID: \"df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc\") " pod="openstack/keystone-3f4d-account-create-update-qmrd4" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.752049 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-create-gftrk"] Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.753497 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gftrk" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.795203 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-combined-ca-bundle\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.795292 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-ovs-rundir\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.795356 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-r4cpf\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.795396 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hsskj\" (UniqueName: 
\"kubernetes.io/projected/61c50130-b1a8-4d94-99e1-02c523a426a7-kube-api-access-hsskj\") pod \"dnsmasq-dns-5bf47b49b7-r4cpf\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.795423 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-config\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.795444 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-r4cpf\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.795462 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk5r6\" (UniqueName: \"kubernetes.io/projected/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-kube-api-access-rk5r6\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.795488 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-config\") pod \"dnsmasq-dns-5bf47b49b7-r4cpf\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.795528 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/host-path/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-ovn-rundir\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.795551 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.798210 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-ovsdbserver-nb\") pod \"dnsmasq-dns-5bf47b49b7-r4cpf\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.798435 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-config\") pod \"dnsmasq-dns-5bf47b49b7-r4cpf\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.807235 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-gftrk"] Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.813645 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-dns-svc\") pod \"dnsmasq-dns-5bf47b49b7-r4cpf\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.841261 4765 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openstack/placement-eb4b-account-create-update-jtnjd"] Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.843195 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-eb4b-account-create-update-jtnjd" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.843483 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsskj\" (UniqueName: \"kubernetes.io/projected/61c50130-b1a8-4d94-99e1-02c523a426a7-kube-api-access-hsskj\") pod \"dnsmasq-dns-5bf47b49b7-r4cpf\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.858701 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3f4d-account-create-update-qmrd4" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.878740 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-db-secret" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.898986 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.899095 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-combined-ca-bundle\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.899184 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-ovs-rundir\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.899250 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31509f20-2f87-4039-b2fc-7a65a11e34e8-operator-scripts\") pod \"placement-db-create-gftrk\" (UID: \"31509f20-2f87-4039-b2fc-7a65a11e34e8\") " pod="openstack/placement-db-create-gftrk" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.899314 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7r58\" (UniqueName: \"kubernetes.io/projected/31509f20-2f87-4039-b2fc-7a65a11e34e8-kube-api-access-w7r58\") pod \"placement-db-create-gftrk\" (UID: \"31509f20-2f87-4039-b2fc-7a65a11e34e8\") " pod="openstack/placement-db-create-gftrk" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.899438 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-config\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.899495 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5r6\" (UniqueName: \"kubernetes.io/projected/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-kube-api-access-rk5r6\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.899555 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" 
(UniqueName: \"kubernetes.io/host-path/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-ovn-rundir\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.911420 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-metrics-certs-tls-certs\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.912773 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-rundir\" (UniqueName: \"kubernetes.io/host-path/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-ovs-rundir\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.913342 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-config\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.930621 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/host-path/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-ovn-rundir\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:08 crc kubenswrapper[4765]: I0319 10:41:08.970628 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.002239 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-combined-ca-bundle\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.003945 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31509f20-2f87-4039-b2fc-7a65a11e34e8-operator-scripts\") pod \"placement-db-create-gftrk\" (UID: \"31509f20-2f87-4039-b2fc-7a65a11e34e8\") " pod="openstack/placement-db-create-gftrk" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.004036 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fxgw5\" (UniqueName: \"kubernetes.io/projected/3cca3085-ce1d-43c5-ada0-89d57e6ce578-kube-api-access-fxgw5\") pod \"placement-eb4b-account-create-update-jtnjd\" (UID: \"3cca3085-ce1d-43c5-ada0-89d57e6ce578\") " pod="openstack/placement-eb4b-account-create-update-jtnjd" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.004094 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w7r58\" (UniqueName: \"kubernetes.io/projected/31509f20-2f87-4039-b2fc-7a65a11e34e8-kube-api-access-w7r58\") pod \"placement-db-create-gftrk\" (UID: \"31509f20-2f87-4039-b2fc-7a65a11e34e8\") " pod="openstack/placement-db-create-gftrk" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.004151 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cca3085-ce1d-43c5-ada0-89d57e6ce578-operator-scripts\") pod 
\"placement-eb4b-account-create-update-jtnjd\" (UID: \"3cca3085-ce1d-43c5-ada0-89d57e6ce578\") " pod="openstack/placement-eb4b-account-create-update-jtnjd" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.005554 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31509f20-2f87-4039-b2fc-7a65a11e34e8-operator-scripts\") pod \"placement-db-create-gftrk\" (UID: \"31509f20-2f87-4039-b2fc-7a65a11e34e8\") " pod="openstack/placement-db-create-gftrk" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.018120 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-eb4b-account-create-update-jtnjd"] Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.044807 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7r58\" (UniqueName: \"kubernetes.io/projected/31509f20-2f87-4039-b2fc-7a65a11e34e8-kube-api-access-w7r58\") pod \"placement-db-create-gftrk\" (UID: \"31509f20-2f87-4039-b2fc-7a65a11e34e8\") " pod="openstack/placement-db-create-gftrk" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.057419 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5r6\" (UniqueName: \"kubernetes.io/projected/c2dd6b2c-bf15-47e4-b9c6-775b176fbadb-kube-api-access-rk5r6\") pod \"ovn-controller-metrics-sq8vg\" (UID: \"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb\") " pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.089207 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-gftrk" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.107412 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fxgw5\" (UniqueName: \"kubernetes.io/projected/3cca3085-ce1d-43c5-ada0-89d57e6ce578-kube-api-access-fxgw5\") pod \"placement-eb4b-account-create-update-jtnjd\" (UID: \"3cca3085-ce1d-43c5-ada0-89d57e6ce578\") " pod="openstack/placement-eb4b-account-create-update-jtnjd" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.107490 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cca3085-ce1d-43c5-ada0-89d57e6ce578-operator-scripts\") pod \"placement-eb4b-account-create-update-jtnjd\" (UID: \"3cca3085-ce1d-43c5-ada0-89d57e6ce578\") " pod="openstack/placement-eb4b-account-create-update-jtnjd" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.108915 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cca3085-ce1d-43c5-ada0-89d57e6ce578-operator-scripts\") pod \"placement-eb4b-account-create-update-jtnjd\" (UID: \"3cca3085-ce1d-43c5-ada0-89d57e6ce578\") " pod="openstack/placement-eb4b-account-create-update-jtnjd" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.151185 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fxgw5\" (UniqueName: \"kubernetes.io/projected/3cca3085-ce1d-43c5-ada0-89d57e6ce578-kube-api-access-fxgw5\") pod \"placement-eb4b-account-create-update-jtnjd\" (UID: \"3cca3085-ce1d-43c5-ada0-89d57e6ce578\") " pod="openstack/placement-eb4b-account-create-update-jtnjd" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.166835 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-r4cpf"] Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.169730 4765 
generic.go:334] "Generic (PLEG): container finished" podID="1ddad710-e7a8-4593-82db-bddeef4de69e" containerID="c12291d97997acb9a21d3c96eb756312edf666fedc21d3b338c7f4bdeaa3d361" exitCode=0 Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.169862 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" event={"ID":"1ddad710-e7a8-4593-82db-bddeef4de69e","Type":"ContainerDied","Data":"c12291d97997acb9a21d3c96eb756312edf666fedc21d3b338c7f4bdeaa3d361"} Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.173711 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gn9lj" event={"ID":"4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c","Type":"ContainerStarted","Data":"f81cbe6cb2cef60c299a01f892018ef810434f3bfb6700702498f6e6592d92b3"} Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.180555 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-505a-account-create-update-gp6pz" event={"ID":"777c1144-ff82-4ac0-a887-e5859bccf142","Type":"ContainerStarted","Data":"735f60d593073012b9cd4e09d42716aa080b8b51ad666afa2054fa50bfd397d6"} Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.181572 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovsdbserver-sb-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.202514 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-8554648995-hfwjh"] Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.213647 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.215985 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovsdbserver-sb" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.234919 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-8554648995-hfwjh"] Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.318183 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b5c6c\" (UniqueName: \"kubernetes.io/projected/7227feea-b1ef-4e43-a224-ef1b078fc070-kube-api-access-b5c6c\") pod \"dnsmasq-dns-8554648995-hfwjh\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.318316 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-hfwjh\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.318355 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-dns-svc\") pod \"dnsmasq-dns-8554648995-hfwjh\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.318390 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-config\") pod \"dnsmasq-dns-8554648995-hfwjh\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " 
pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.318436 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-hfwjh\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.324458 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-metrics-sq8vg" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.340584 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovsdbserver-sb-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.414325 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-eb4b-account-create-update-jtnjd" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.432011 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b5c6c\" (UniqueName: \"kubernetes.io/projected/7227feea-b1ef-4e43-a224-ef1b078fc070-kube-api-access-b5c6c\") pod \"dnsmasq-dns-8554648995-hfwjh\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.434222 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-hfwjh\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.434342 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-dns-svc\") pod \"dnsmasq-dns-8554648995-hfwjh\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.434868 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-config\") pod \"dnsmasq-dns-8554648995-hfwjh\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.434990 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-ovsdbserver-sb\") pod \"dnsmasq-dns-8554648995-hfwjh\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.435236 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-hfwjh\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.436133 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-ovsdbserver-nb\") pod \"dnsmasq-dns-8554648995-hfwjh\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.436156 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-config\") pod \"dnsmasq-dns-8554648995-hfwjh\" 
(UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.436508 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-dns-svc\") pod \"dnsmasq-dns-8554648995-hfwjh\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.466017 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b5c6c\" (UniqueName: \"kubernetes.io/projected/7227feea-b1ef-4e43-a224-ef1b078fc070-kube-api-access-b5c6c\") pod \"dnsmasq-dns-8554648995-hfwjh\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.551796 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-northd-0"] Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.553727 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.555446 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ovnnorthd-ovndbs" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.555446 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ovnnorthd-ovnnorthd-dockercfg-rgklx" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.557781 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-scripts" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.558027 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovnnorthd-config" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.572566 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.601002 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.644888 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95192c6a-3899-4f62-bfca-47ad91bd17f1-scripts\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.645027 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95192c6a-3899-4f62-bfca-47ad91bd17f1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.645081 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/95192c6a-3899-4f62-bfca-47ad91bd17f1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.645116 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/95192c6a-3899-4f62-bfca-47ad91bd17f1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.645162 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dcdr\" (UniqueName: \"kubernetes.io/projected/95192c6a-3899-4f62-bfca-47ad91bd17f1-kube-api-access-5dcdr\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.645264 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95192c6a-3899-4f62-bfca-47ad91bd17f1-config\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.645321 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/95192c6a-3899-4f62-bfca-47ad91bd17f1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.705348 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.751424 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.752423 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ddad710-e7a8-4593-82db-bddeef4de69e-dns-svc\") pod \"1ddad710-e7a8-4593-82db-bddeef4de69e\" (UID: \"1ddad710-e7a8-4593-82db-bddeef4de69e\") " Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.752546 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ddad710-e7a8-4593-82db-bddeef4de69e-config\") pod \"1ddad710-e7a8-4593-82db-bddeef4de69e\" (UID: \"1ddad710-e7a8-4593-82db-bddeef4de69e\") " Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.752696 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4w5vq\" (UniqueName: \"kubernetes.io/projected/1ddad710-e7a8-4593-82db-bddeef4de69e-kube-api-access-4w5vq\") pod \"1ddad710-e7a8-4593-82db-bddeef4de69e\" (UID: \"1ddad710-e7a8-4593-82db-bddeef4de69e\") " Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.753085 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95192c6a-3899-4f62-bfca-47ad91bd17f1-config\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.753137 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/95192c6a-3899-4f62-bfca-47ad91bd17f1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" 
Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.753194 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95192c6a-3899-4f62-bfca-47ad91bd17f1-scripts\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.753247 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95192c6a-3899-4f62-bfca-47ad91bd17f1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.753273 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/95192c6a-3899-4f62-bfca-47ad91bd17f1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.753301 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-rundir\" (UniqueName: \"kubernetes.io/empty-dir/95192c6a-3899-4f62-bfca-47ad91bd17f1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.753324 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5dcdr\" (UniqueName: \"kubernetes.io/projected/95192c6a-3899-4f62-bfca-47ad91bd17f1-kube-api-access-5dcdr\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.757607 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-rundir\" (UniqueName: 
\"kubernetes.io/empty-dir/95192c6a-3899-4f62-bfca-47ad91bd17f1-ovn-rundir\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.763590 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/95192c6a-3899-4f62-bfca-47ad91bd17f1-scripts\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.764304 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/95192c6a-3899-4f62-bfca-47ad91bd17f1-config\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.764977 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs-tls-certs\" (UniqueName: \"kubernetes.io/secret/95192c6a-3899-4f62-bfca-47ad91bd17f1-metrics-certs-tls-certs\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.780134 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-northd-tls-certs\" (UniqueName: \"kubernetes.io/secret/95192c6a-3899-4f62-bfca-47ad91bd17f1-ovn-northd-tls-certs\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.787653 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ddad710-e7a8-4593-82db-bddeef4de69e-kube-api-access-4w5vq" (OuterVolumeSpecName: "kube-api-access-4w5vq") pod "1ddad710-e7a8-4593-82db-bddeef4de69e" (UID: "1ddad710-e7a8-4593-82db-bddeef4de69e"). InnerVolumeSpecName "kube-api-access-4w5vq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.798189 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dcdr\" (UniqueName: \"kubernetes.io/projected/95192c6a-3899-4f62-bfca-47ad91bd17f1-kube-api-access-5dcdr\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.798429 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/95192c6a-3899-4f62-bfca-47ad91bd17f1-combined-ca-bundle\") pod \"ovn-northd-0\" (UID: \"95192c6a-3899-4f62-bfca-47ad91bd17f1\") " pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.835713 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-hfwjh"] Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.862772 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4w5vq\" (UniqueName: \"kubernetes.io/projected/1ddad710-e7a8-4593-82db-bddeef4de69e-kube-api-access-4w5vq\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.866940 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-create-25h9b"] Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.876500 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ddad710-e7a8-4593-82db-bddeef4de69e-config" (OuterVolumeSpecName: "config") pod "1ddad710-e7a8-4593-82db-bddeef4de69e" (UID: "1ddad710-e7a8-4593-82db-bddeef4de69e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.891213 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-northd-0" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.897275 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2qgsd"] Mar 19 10:41:09 crc kubenswrapper[4765]: E0319 10:41:09.897705 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ddad710-e7a8-4593-82db-bddeef4de69e" containerName="init" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.897717 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ddad710-e7a8-4593-82db-bddeef4de69e" containerName="init" Mar 19 10:41:09 crc kubenswrapper[4765]: E0319 10:41:09.897730 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ddad710-e7a8-4593-82db-bddeef4de69e" containerName="dnsmasq-dns" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.897738 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ddad710-e7a8-4593-82db-bddeef4de69e" containerName="dnsmasq-dns" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.897889 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ddad710-e7a8-4593-82db-bddeef4de69e" containerName="dnsmasq-dns" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.899055 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:09 crc kubenswrapper[4765]: W0319 10:41:09.904591 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcb59bf52_1cd4_4d26_9c5d_9ee4561267c5.slice/crio-313b158ebb4a060fb04a79067bd2ecb1a44ad597821a1dd25fb9fb6470f82820 WatchSource:0}: Error finding container 313b158ebb4a060fb04a79067bd2ecb1a44ad597821a1dd25fb9fb6470f82820: Status 404 returned error can't find the container with id 313b158ebb4a060fb04a79067bd2ecb1a44ad597821a1dd25fb9fb6470f82820 Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.918714 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ddad710-e7a8-4593-82db-bddeef4de69e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "1ddad710-e7a8-4593-82db-bddeef4de69e" (UID: "1ddad710-e7a8-4593-82db-bddeef4de69e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.919204 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2qgsd"] Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.964117 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-2qgsd\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.964171 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-2qgsd\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 
10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.964217 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-config\") pod \"dnsmasq-dns-b8fbc5445-2qgsd\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.964239 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-2qgsd\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.964279 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tr65\" (UniqueName: \"kubernetes.io/projected/94883a00-ab76-403f-8733-9d2413012855-kube-api-access-6tr65\") pod \"dnsmasq-dns-b8fbc5445-2qgsd\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.964343 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/1ddad710-e7a8-4593-82db-bddeef4de69e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:09 crc kubenswrapper[4765]: I0319 10:41:09.964354 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ddad710-e7a8-4593-82db-bddeef4de69e-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.066187 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-config\") pod \"dnsmasq-dns-b8fbc5445-2qgsd\" (UID: 
\"94883a00-ab76-403f-8733-9d2413012855\") " pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.066254 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-2qgsd\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.066304 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6tr65\" (UniqueName: \"kubernetes.io/projected/94883a00-ab76-403f-8733-9d2413012855-kube-api-access-6tr65\") pod \"dnsmasq-dns-b8fbc5445-2qgsd\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.066375 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-2qgsd\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.066394 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-2qgsd\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.068768 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-config\") pod \"dnsmasq-dns-b8fbc5445-2qgsd\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" 
Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.069140 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-ovsdbserver-nb\") pod \"dnsmasq-dns-b8fbc5445-2qgsd\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.069918 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-ovsdbserver-sb\") pod \"dnsmasq-dns-b8fbc5445-2qgsd\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.070093 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-dns-svc\") pod \"dnsmasq-dns-b8fbc5445-2qgsd\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.096228 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tr65\" (UniqueName: \"kubernetes.io/projected/94883a00-ab76-403f-8733-9d2413012855-kube-api-access-6tr65\") pod \"dnsmasq-dns-b8fbc5445-2qgsd\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.114483 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.202380 4765 generic.go:334] "Generic (PLEG): container finished" podID="4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c" containerID="5f13f02cae162b12e1daf9f9d85ecaa6386d6acdeb2214a434274b38d635c0fd" exitCode=0 Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.202456 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gn9lj" event={"ID":"4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c","Type":"ContainerDied","Data":"5f13f02cae162b12e1daf9f9d85ecaa6386d6acdeb2214a434274b38d635c0fd"} Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.203921 4765 generic.go:334] "Generic (PLEG): container finished" podID="777c1144-ff82-4ac0-a887-e5859bccf142" containerID="319a3617a109994e7fdcc8484dc49848d8ba20de19e30ee60a87eef1b0301cb5" exitCode=0 Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.204000 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-505a-account-create-update-gp6pz" event={"ID":"777c1144-ff82-4ac0-a887-e5859bccf142","Type":"ContainerDied","Data":"319a3617a109994e7fdcc8484dc49848d8ba20de19e30ee60a87eef1b0301cb5"} Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.218634 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-25h9b" event={"ID":"cb59bf52-1cd4-4d26-9c5d-9ee4561267c5","Type":"ContainerStarted","Data":"313b158ebb4a060fb04a79067bd2ecb1a44ad597821a1dd25fb9fb6470f82820"} Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.245134 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.246634 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-57d769cc4f-n9ml8" event={"ID":"1ddad710-e7a8-4593-82db-bddeef4de69e","Type":"ContainerDied","Data":"eb190c05ec32f135adfebd77dcad2034638933aa7e5f7e0f65a85950d61a7c47"} Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.246682 4765 scope.go:117] "RemoveContainer" containerID="c12291d97997acb9a21d3c96eb756312edf666fedc21d3b338c7f4bdeaa3d361" Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.287231 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-metrics-sq8vg"] Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.312648 4765 scope.go:117] "RemoveContainer" containerID="99a58ef94b3ce25dde495aaf9b57e544ef78cb3150b346c560b7e96695da1821" Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.327059 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-n9ml8"] Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.336537 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-57d769cc4f-n9ml8"] Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.350491 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-create-gftrk"] Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.394436 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ddad710-e7a8-4593-82db-bddeef4de69e" path="/var/lib/kubelet/pods/1ddad710-e7a8-4593-82db-bddeef4de69e/volumes" Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.442001 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-hfwjh"] Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.495214 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-eb4b-account-create-update-jtnjd"] Mar 19 
10:41:10 crc kubenswrapper[4765]: W0319 10:41:10.514039 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61c50130_b1a8_4d94_99e1_02c523a426a7.slice/crio-37a8778ed894117055c0f38c65d4f4b7ad33cd16d3dc0762754fe10094d79573 WatchSource:0}: Error finding container 37a8778ed894117055c0f38c65d4f4b7ad33cd16d3dc0762754fe10094d79573: Status 404 returned error can't find the container with id 37a8778ed894117055c0f38c65d4f4b7ad33cd16d3dc0762754fe10094d79573 Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.550428 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-r4cpf"] Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.572807 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-3f4d-account-create-update-qmrd4"] Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.629194 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-northd-0"] Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.778289 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2qgsd"] Mar 19 10:41:10 crc kubenswrapper[4765]: W0319 10:41:10.801868 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod94883a00_ab76_403f_8733_9d2413012855.slice/crio-a42d636193d071177e8016de59ffcb924d4536cebe07a3ef9cdd32a5080703b1 WatchSource:0}: Error finding container a42d636193d071177e8016de59ffcb924d4536cebe07a3ef9cdd32a5080703b1: Status 404 returned error can't find the container with id a42d636193d071177e8016de59ffcb924d4536cebe07a3ef9cdd32a5080703b1 Mar 19 10:41:10 crc kubenswrapper[4765]: I0319 10:41:10.986209 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-storage-0"] Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.001121 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.007827 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-swift-dockercfg-7pjn9" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.008353 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-conf" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.008732 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-storage-config-data" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.009003 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-files" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.058859 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.099282 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/21734dce-e034-473f-a919-7026f837ede2-cache\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.099356 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.099384 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc 
kubenswrapper[4765]: I0319 10:41:11.099444 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4c8qm\" (UniqueName: \"kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-kube-api-access-4c8qm\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.099488 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/21734dce-e034-473f-a919-7026f837ede2-lock\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.099524 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21734dce-e034-473f-a919-7026f837ede2-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.200613 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/21734dce-e034-473f-a919-7026f837ede2-cache\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.200669 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.200687 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.200729 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4c8qm\" (UniqueName: \"kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-kube-api-access-4c8qm\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.200763 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/21734dce-e034-473f-a919-7026f837ede2-lock\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.200795 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21734dce-e034-473f-a919-7026f837ede2-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.202695 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/21734dce-e034-473f-a919-7026f837ede2-cache\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: E0319 10:41:11.210570 4765 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 10:41:11 crc kubenswrapper[4765]: E0319 10:41:11.210894 4765 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 
19 10:41:11 crc kubenswrapper[4765]: E0319 10:41:11.210939 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift podName:21734dce-e034-473f-a919-7026f837ede2 nodeName:}" failed. No retries permitted until 2026-03-19 10:41:11.710923229 +0000 UTC m=+1170.059868771 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift") pod "swift-storage-0" (UID: "21734dce-e034-473f-a919-7026f837ede2") : configmap "swift-ring-files" not found Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.211315 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") device mount path \"/mnt/openstack/pv05\"" pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.211926 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/21734dce-e034-473f-a919-7026f837ede2-lock\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.232010 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21734dce-e034-473f-a919-7026f837ede2-combined-ca-bundle\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.275976 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " 
pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.285702 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4c8qm\" (UniqueName: \"kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-kube-api-access-4c8qm\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.304013 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"95192c6a-3899-4f62-bfca-47ad91bd17f1","Type":"ContainerStarted","Data":"37d827d68b58c1d46f1dfd3876231eb26de5641f6fb41261f75a1d8fc89214a9"} Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.310166 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" event={"ID":"61c50130-b1a8-4d94-99e1-02c523a426a7","Type":"ContainerStarted","Data":"37a8778ed894117055c0f38c65d4f4b7ad33cd16d3dc0762754fe10094d79573"} Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.313765 4765 generic.go:334] "Generic (PLEG): container finished" podID="cb59bf52-1cd4-4d26-9c5d-9ee4561267c5" containerID="1876e2731ada5fcbc3fe82577681e2b96f37a56a78f1c4731475777f13f08b23" exitCode=0 Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.314490 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-25h9b" event={"ID":"cb59bf52-1cd4-4d26-9c5d-9ee4561267c5","Type":"ContainerDied","Data":"1876e2731ada5fcbc3fe82577681e2b96f37a56a78f1c4731475777f13f08b23"} Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.319073 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" event={"ID":"94883a00-ab76-403f-8733-9d2413012855","Type":"ContainerStarted","Data":"a42d636193d071177e8016de59ffcb924d4536cebe07a3ef9cdd32a5080703b1"} Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.321672 4765 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/keystone-3f4d-account-create-update-qmrd4" event={"ID":"df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc","Type":"ContainerStarted","Data":"42d5747a0b80f79f343538a6daea1710955b1ade686b916eb415f3186624c37a"} Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.321702 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3f4d-account-create-update-qmrd4" event={"ID":"df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc","Type":"ContainerStarted","Data":"39a6bd60b6f736f897e58d8b6ae9933780fa2312f79be1eb2ce9d8c82f0ac939"} Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.323768 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-hfwjh" event={"ID":"7227feea-b1ef-4e43-a224-ef1b078fc070","Type":"ContainerStarted","Data":"be52049aec441c57abd7417673d3944592fde2e3e37b65f59ac057939ffd3792"} Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.325464 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-sq8vg" event={"ID":"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb","Type":"ContainerStarted","Data":"56ddcb342e219ba87ad8c4b44544b7b343b0ece745b365065ec6756294dfc88b"} Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.325488 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-metrics-sq8vg" event={"ID":"c2dd6b2c-bf15-47e4-b9c6-775b176fbadb","Type":"ContainerStarted","Data":"aaa51a846d298a22908d8708ec98c6545f9fa5692b2ad2c93a3973bd6a0a2cff"} Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.336720 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-eb4b-account-create-update-jtnjd" event={"ID":"3cca3085-ce1d-43c5-ada0-89d57e6ce578","Type":"ContainerStarted","Data":"ea969d495984141102882c64dee73fec4a6bf28cfc97fd0ba8ed66f28b209303"} Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.336772 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/placement-eb4b-account-create-update-jtnjd" event={"ID":"3cca3085-ce1d-43c5-ada0-89d57e6ce578","Type":"ContainerStarted","Data":"d44d1445161bde50237dd375a59d77f95e397ee38066a6ae540f17d548194a11"} Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.347301 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gftrk" event={"ID":"31509f20-2f87-4039-b2fc-7a65a11e34e8","Type":"ContainerStarted","Data":"25d836aaf60cf437302090a0c8a573627f829543e9d3b69586e24d3139666592"} Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.347356 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gftrk" event={"ID":"31509f20-2f87-4039-b2fc-7a65a11e34e8","Type":"ContainerStarted","Data":"17ae74cf77e9209f1fac97b64f899958a817169a9d4f0a2f1177559312b1d112"} Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.358092 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-controller-metrics-sq8vg" podStartSLOduration=3.35807188 podStartE2EDuration="3.35807188s" podCreationTimestamp="2026-03-19 10:41:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:41:11.35698734 +0000 UTC m=+1169.705932882" watchObservedRunningTime="2026-03-19 10:41:11.35807188 +0000 UTC m=+1169.707017422" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.408735 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-3f4d-account-create-update-qmrd4" podStartSLOduration=3.408706313 podStartE2EDuration="3.408706313s" podCreationTimestamp="2026-03-19 10:41:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:41:11.40159022 +0000 UTC m=+1169.750535762" watchObservedRunningTime="2026-03-19 10:41:11.408706313 +0000 UTC m=+1169.757651855" Mar 19 10:41:11 
crc kubenswrapper[4765]: I0319 10:41:11.438316 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-eb4b-account-create-update-jtnjd" podStartSLOduration=3.438296826 podStartE2EDuration="3.438296826s" podCreationTimestamp="2026-03-19 10:41:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:41:11.436677942 +0000 UTC m=+1169.785623484" watchObservedRunningTime="2026-03-19 10:41:11.438296826 +0000 UTC m=+1169.787242368" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.625692 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-ring-rebalance-f5w89"] Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.626739 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.629848 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-config-data" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.630011 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"swift-ring-scripts" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.630142 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.644864 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-f5w89"] Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.710428 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bd78fb4a-24b1-4fb7-8994-3668d29ff042-etc-swift\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc 
kubenswrapper[4765]: I0319 10:41:11.710468 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bd78fb4a-24b1-4fb7-8994-3668d29ff042-ring-data-devices\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.710487 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-dispersionconf\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.710600 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-swiftconf\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.710633 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b8q74\" (UniqueName: \"kubernetes.io/projected/bd78fb4a-24b1-4fb7-8994-3668d29ff042-kube-api-access-b8q74\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.710729 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd78fb4a-24b1-4fb7-8994-3668d29ff042-scripts\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 
10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.710758 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-combined-ca-bundle\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.779675 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gn9lj" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.812625 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c-operator-scripts\") pod \"4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c\" (UID: \"4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c\") " Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.812708 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-52s6l\" (UniqueName: \"kubernetes.io/projected/4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c-kube-api-access-52s6l\") pod \"4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c\" (UID: \"4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c\") " Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.813263 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd78fb4a-24b1-4fb7-8994-3668d29ff042-scripts\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.813318 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-combined-ca-bundle\") pod \"swift-ring-rebalance-f5w89\" (UID: 
\"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.813367 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bd78fb4a-24b1-4fb7-8994-3668d29ff042-etc-swift\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.813394 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bd78fb4a-24b1-4fb7-8994-3668d29ff042-ring-data-devices\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.813417 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-dispersionconf\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.813480 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-swiftconf\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.813512 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b8q74\" (UniqueName: \"kubernetes.io/projected/bd78fb4a-24b1-4fb7-8994-3668d29ff042-kube-api-access-b8q74\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " 
pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.813595 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:11 crc kubenswrapper[4765]: E0319 10:41:11.813867 4765 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 10:41:11 crc kubenswrapper[4765]: E0319 10:41:11.813900 4765 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 10:41:11 crc kubenswrapper[4765]: E0319 10:41:11.813995 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift podName:21734dce-e034-473f-a919-7026f837ede2 nodeName:}" failed. No retries permitted until 2026-03-19 10:41:12.813949854 +0000 UTC m=+1171.162895396 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift") pod "swift-storage-0" (UID: "21734dce-e034-473f-a919-7026f837ede2") : configmap "swift-ring-files" not found Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.814175 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bd78fb4a-24b1-4fb7-8994-3668d29ff042-ring-data-devices\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.814198 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c" (UID: "4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.814943 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd78fb4a-24b1-4fb7-8994-3668d29ff042-scripts\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.815010 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bd78fb4a-24b1-4fb7-8994-3668d29ff042-etc-swift\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.821130 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-swiftconf\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.825369 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c-kube-api-access-52s6l" (OuterVolumeSpecName: "kube-api-access-52s6l") pod "4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c" (UID: "4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c"). InnerVolumeSpecName "kube-api-access-52s6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.825567 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-dispersionconf\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.828380 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-combined-ca-bundle\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.832864 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b8q74\" (UniqueName: \"kubernetes.io/projected/bd78fb4a-24b1-4fb7-8994-3668d29ff042-kube-api-access-b8q74\") pod \"swift-ring-rebalance-f5w89\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.900890 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-505a-account-create-update-gp6pz" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.920467 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:11 crc kubenswrapper[4765]: I0319 10:41:11.920803 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-52s6l\" (UniqueName: \"kubernetes.io/projected/4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c-kube-api-access-52s6l\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.023968 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2tf8k\" (UniqueName: \"kubernetes.io/projected/777c1144-ff82-4ac0-a887-e5859bccf142-kube-api-access-2tf8k\") pod \"777c1144-ff82-4ac0-a887-e5859bccf142\" (UID: \"777c1144-ff82-4ac0-a887-e5859bccf142\") " Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.024081 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/777c1144-ff82-4ac0-a887-e5859bccf142-operator-scripts\") pod \"777c1144-ff82-4ac0-a887-e5859bccf142\" (UID: \"777c1144-ff82-4ac0-a887-e5859bccf142\") " Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.024972 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/777c1144-ff82-4ac0-a887-e5859bccf142-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "777c1144-ff82-4ac0-a887-e5859bccf142" (UID: "777c1144-ff82-4ac0-a887-e5859bccf142"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.032472 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/777c1144-ff82-4ac0-a887-e5859bccf142-kube-api-access-2tf8k" (OuterVolumeSpecName: "kube-api-access-2tf8k") pod "777c1144-ff82-4ac0-a887-e5859bccf142" (UID: "777c1144-ff82-4ac0-a887-e5859bccf142"). InnerVolumeSpecName "kube-api-access-2tf8k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.032832 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.127486 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2tf8k\" (UniqueName: \"kubernetes.io/projected/777c1144-ff82-4ac0-a887-e5859bccf142-kube-api-access-2tf8k\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.127548 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/777c1144-ff82-4ac0-a887-e5859bccf142-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.407660 4765 generic.go:334] "Generic (PLEG): container finished" podID="31509f20-2f87-4039-b2fc-7a65a11e34e8" containerID="25d836aaf60cf437302090a0c8a573627f829543e9d3b69586e24d3139666592" exitCode=0 Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.408027 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gftrk" event={"ID":"31509f20-2f87-4039-b2fc-7a65a11e34e8","Type":"ContainerDied","Data":"25d836aaf60cf437302090a0c8a573627f829543e9d3b69586e24d3139666592"} Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.418562 4765 generic.go:334] "Generic (PLEG): container finished" podID="94883a00-ab76-403f-8733-9d2413012855" 
containerID="239ab186dc20d74ad0b9369dff9ee09519beec1de4ea39f16dd3e86139c99cb6" exitCode=0 Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.418704 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" event={"ID":"94883a00-ab76-403f-8733-9d2413012855","Type":"ContainerDied","Data":"239ab186dc20d74ad0b9369dff9ee09519beec1de4ea39f16dd3e86139c99cb6"} Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.423148 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-create-gn9lj" Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.423109 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-create-gn9lj" event={"ID":"4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c","Type":"ContainerDied","Data":"f81cbe6cb2cef60c299a01f892018ef810434f3bfb6700702498f6e6592d92b3"} Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.423269 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f81cbe6cb2cef60c299a01f892018ef810434f3bfb6700702498f6e6592d92b3" Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.428785 4765 generic.go:334] "Generic (PLEG): container finished" podID="df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc" containerID="42d5747a0b80f79f343538a6daea1710955b1ade686b916eb415f3186624c37a" exitCode=0 Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.428887 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3f4d-account-create-update-qmrd4" event={"ID":"df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc","Type":"ContainerDied","Data":"42d5747a0b80f79f343538a6daea1710955b1ade686b916eb415f3186624c37a"} Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.436766 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-505a-account-create-update-gp6pz" 
event={"ID":"777c1144-ff82-4ac0-a887-e5859bccf142","Type":"ContainerDied","Data":"735f60d593073012b9cd4e09d42716aa080b8b51ad666afa2054fa50bfd397d6"} Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.436826 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="735f60d593073012b9cd4e09d42716aa080b8b51ad666afa2054fa50bfd397d6" Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.436978 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-505a-account-create-update-gp6pz" Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.442000 4765 generic.go:334] "Generic (PLEG): container finished" podID="7227feea-b1ef-4e43-a224-ef1b078fc070" containerID="38b0107116f2938a57ca101e12aab1daa10a976d6725747228d5472fb310b6e7" exitCode=0 Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.442107 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-8554648995-hfwjh" event={"ID":"7227feea-b1ef-4e43-a224-ef1b078fc070","Type":"ContainerDied","Data":"38b0107116f2938a57ca101e12aab1daa10a976d6725747228d5472fb310b6e7"} Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.456169 4765 generic.go:334] "Generic (PLEG): container finished" podID="61c50130-b1a8-4d94-99e1-02c523a426a7" containerID="c80b1a9fcddca73752ce670704933e787caf22053f86702b0ffdcdef30162e81" exitCode=0 Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.456469 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" event={"ID":"61c50130-b1a8-4d94-99e1-02c523a426a7","Type":"ContainerDied","Data":"c80b1a9fcddca73752ce670704933e787caf22053f86702b0ffdcdef30162e81"} Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.475067 4765 generic.go:334] "Generic (PLEG): container finished" podID="3cca3085-ce1d-43c5-ada0-89d57e6ce578" containerID="ea969d495984141102882c64dee73fec4a6bf28cfc97fd0ba8ed66f28b209303" exitCode=0 Mar 19 10:41:12 crc 
kubenswrapper[4765]: I0319 10:41:12.475974 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-eb4b-account-create-update-jtnjd" event={"ID":"3cca3085-ce1d-43c5-ada0-89d57e6ce578","Type":"ContainerDied","Data":"ea969d495984141102882c64dee73fec4a6bf28cfc97fd0ba8ed66f28b209303"} Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.722389 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-ring-rebalance-f5w89"] Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.771853 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-create-gftrk" Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.858623 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7r58\" (UniqueName: \"kubernetes.io/projected/31509f20-2f87-4039-b2fc-7a65a11e34e8-kube-api-access-w7r58\") pod \"31509f20-2f87-4039-b2fc-7a65a11e34e8\" (UID: \"31509f20-2f87-4039-b2fc-7a65a11e34e8\") " Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.858746 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31509f20-2f87-4039-b2fc-7a65a11e34e8-operator-scripts\") pod \"31509f20-2f87-4039-b2fc-7a65a11e34e8\" (UID: \"31509f20-2f87-4039-b2fc-7a65a11e34e8\") " Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.859244 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.859430 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31509f20-2f87-4039-b2fc-7a65a11e34e8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod 
"31509f20-2f87-4039-b2fc-7a65a11e34e8" (UID: "31509f20-2f87-4039-b2fc-7a65a11e34e8"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:12 crc kubenswrapper[4765]: E0319 10:41:12.859525 4765 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 10:41:12 crc kubenswrapper[4765]: E0319 10:41:12.859730 4765 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 10:41:12 crc kubenswrapper[4765]: E0319 10:41:12.859787 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift podName:21734dce-e034-473f-a919-7026f837ede2 nodeName:}" failed. No retries permitted until 2026-03-19 10:41:14.859765568 +0000 UTC m=+1173.208711110 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift") pod "swift-storage-0" (UID: "21734dce-e034-473f-a919-7026f837ede2") : configmap "swift-ring-files" not found Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.891427 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31509f20-2f87-4039-b2fc-7a65a11e34e8-kube-api-access-w7r58" (OuterVolumeSpecName: "kube-api-access-w7r58") pod "31509f20-2f87-4039-b2fc-7a65a11e34e8" (UID: "31509f20-2f87-4039-b2fc-7a65a11e34e8"). InnerVolumeSpecName "kube-api-access-w7r58". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.961797 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7r58\" (UniqueName: \"kubernetes.io/projected/31509f20-2f87-4039-b2fc-7a65a11e34e8-kube-api-access-w7r58\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:12 crc kubenswrapper[4765]: I0319 10:41:12.961835 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/31509f20-2f87-4039-b2fc-7a65a11e34e8-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.049421 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.164453 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-25h9b" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.165214 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-config\") pod \"7227feea-b1ef-4e43-a224-ef1b078fc070\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.165328 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-dns-svc\") pod \"7227feea-b1ef-4e43-a224-ef1b078fc070\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.165381 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b5c6c\" (UniqueName: \"kubernetes.io/projected/7227feea-b1ef-4e43-a224-ef1b078fc070-kube-api-access-b5c6c\") pod \"7227feea-b1ef-4e43-a224-ef1b078fc070\" (UID: 
\"7227feea-b1ef-4e43-a224-ef1b078fc070\") " Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.165453 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-ovsdbserver-nb\") pod \"7227feea-b1ef-4e43-a224-ef1b078fc070\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.165532 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-ovsdbserver-sb\") pod \"7227feea-b1ef-4e43-a224-ef1b078fc070\" (UID: \"7227feea-b1ef-4e43-a224-ef1b078fc070\") " Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.184406 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7227feea-b1ef-4e43-a224-ef1b078fc070-kube-api-access-b5c6c" (OuterVolumeSpecName: "kube-api-access-b5c6c") pod "7227feea-b1ef-4e43-a224-ef1b078fc070" (UID: "7227feea-b1ef-4e43-a224-ef1b078fc070"). InnerVolumeSpecName "kube-api-access-b5c6c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.192272 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.194135 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-config" (OuterVolumeSpecName: "config") pod "7227feea-b1ef-4e43-a224-ef1b078fc070" (UID: "7227feea-b1ef-4e43-a224-ef1b078fc070"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.219818 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "7227feea-b1ef-4e43-a224-ef1b078fc070" (UID: "7227feea-b1ef-4e43-a224-ef1b078fc070"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.221022 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "7227feea-b1ef-4e43-a224-ef1b078fc070" (UID: "7227feea-b1ef-4e43-a224-ef1b078fc070"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.226442 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "7227feea-b1ef-4e43-a224-ef1b078fc070" (UID: "7227feea-b1ef-4e43-a224-ef1b078fc070"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.266512 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsskj\" (UniqueName: \"kubernetes.io/projected/61c50130-b1a8-4d94-99e1-02c523a426a7-kube-api-access-hsskj\") pod \"61c50130-b1a8-4d94-99e1-02c523a426a7\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.266564 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-config\") pod \"61c50130-b1a8-4d94-99e1-02c523a426a7\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.266609 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4rdx\" (UniqueName: \"kubernetes.io/projected/cb59bf52-1cd4-4d26-9c5d-9ee4561267c5-kube-api-access-w4rdx\") pod \"cb59bf52-1cd4-4d26-9c5d-9ee4561267c5\" (UID: \"cb59bf52-1cd4-4d26-9c5d-9ee4561267c5\") " Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.267681 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-ovsdbserver-nb\") pod \"61c50130-b1a8-4d94-99e1-02c523a426a7\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.267713 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb59bf52-1cd4-4d26-9c5d-9ee4561267c5-operator-scripts\") pod \"cb59bf52-1cd4-4d26-9c5d-9ee4561267c5\" (UID: \"cb59bf52-1cd4-4d26-9c5d-9ee4561267c5\") " Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.267768 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" 
(UniqueName: \"kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-dns-svc\") pod \"61c50130-b1a8-4d94-99e1-02c523a426a7\" (UID: \"61c50130-b1a8-4d94-99e1-02c523a426a7\") " Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.268761 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb59bf52-1cd4-4d26-9c5d-9ee4561267c5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "cb59bf52-1cd4-4d26-9c5d-9ee4561267c5" (UID: "cb59bf52-1cd4-4d26-9c5d-9ee4561267c5"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.268876 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.268920 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b5c6c\" (UniqueName: \"kubernetes.io/projected/7227feea-b1ef-4e43-a224-ef1b078fc070-kube-api-access-b5c6c\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.268972 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.268985 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.268997 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7227feea-b1ef-4e43-a224-ef1b078fc070-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:13 crc kubenswrapper[4765]: 
I0319 10:41:13.270818 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb59bf52-1cd4-4d26-9c5d-9ee4561267c5-kube-api-access-w4rdx" (OuterVolumeSpecName: "kube-api-access-w4rdx") pod "cb59bf52-1cd4-4d26-9c5d-9ee4561267c5" (UID: "cb59bf52-1cd4-4d26-9c5d-9ee4561267c5"). InnerVolumeSpecName "kube-api-access-w4rdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.271306 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61c50130-b1a8-4d94-99e1-02c523a426a7-kube-api-access-hsskj" (OuterVolumeSpecName: "kube-api-access-hsskj") pod "61c50130-b1a8-4d94-99e1-02c523a426a7" (UID: "61c50130-b1a8-4d94-99e1-02c523a426a7"). InnerVolumeSpecName "kube-api-access-hsskj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.284036 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-config" (OuterVolumeSpecName: "config") pod "61c50130-b1a8-4d94-99e1-02c523a426a7" (UID: "61c50130-b1a8-4d94-99e1-02c523a426a7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.291685 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "61c50130-b1a8-4d94-99e1-02c523a426a7" (UID: "61c50130-b1a8-4d94-99e1-02c523a426a7"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.298277 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "61c50130-b1a8-4d94-99e1-02c523a426a7" (UID: "61c50130-b1a8-4d94-99e1-02c523a426a7"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.370917 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.370977 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4rdx\" (UniqueName: \"kubernetes.io/projected/cb59bf52-1cd4-4d26-9c5d-9ee4561267c5-kube-api-access-w4rdx\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.370995 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.371004 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/cb59bf52-1cd4-4d26-9c5d-9ee4561267c5-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.371012 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/61c50130-b1a8-4d94-99e1-02c523a426a7-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.371020 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hsskj\" (UniqueName: 
\"kubernetes.io/projected/61c50130-b1a8-4d94-99e1-02c523a426a7-kube-api-access-hsskj\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.494636 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-create-25h9b" event={"ID":"cb59bf52-1cd4-4d26-9c5d-9ee4561267c5","Type":"ContainerDied","Data":"313b158ebb4a060fb04a79067bd2ecb1a44ad597821a1dd25fb9fb6470f82820"} Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.494681 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="313b158ebb4a060fb04a79067bd2ecb1a44ad597821a1dd25fb9fb6470f82820" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.494744 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-create-25h9b" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.496100 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f5w89" event={"ID":"bd78fb4a-24b1-4fb7-8994-3668d29ff042","Type":"ContainerStarted","Data":"2e5fb79704b20250dff8d5f96c45f80c308a9f5b8c4813e8df1ef59631cb24fb"} Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.497867 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-create-gftrk" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.497877 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-create-gftrk" event={"ID":"31509f20-2f87-4039-b2fc-7a65a11e34e8","Type":"ContainerDied","Data":"17ae74cf77e9209f1fac97b64f899958a817169a9d4f0a2f1177559312b1d112"} Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.497951 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="17ae74cf77e9209f1fac97b64f899958a817169a9d4f0a2f1177559312b1d112" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.501090 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" event={"ID":"94883a00-ab76-403f-8733-9d2413012855","Type":"ContainerStarted","Data":"12f794c628abcf06c7915cdc0ed4ebc02684fff5d50145b6e69b6f22c1da6822"} Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.501252 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.502863 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"95192c6a-3899-4f62-bfca-47ad91bd17f1","Type":"ContainerStarted","Data":"1c358c43de8d088bcabdf297fd9542df622b1445cb65f2ae53da8f3091fe815d"} Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.502894 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-northd-0" event={"ID":"95192c6a-3899-4f62-bfca-47ad91bd17f1","Type":"ContainerStarted","Data":"5b13ea9b614fa6a215fe2e36cf6479e10ad329b4764452c94a32a75d56678d2a"} Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.503410 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ovn-northd-0" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.511634 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-8554648995-hfwjh" event={"ID":"7227feea-b1ef-4e43-a224-ef1b078fc070","Type":"ContainerDied","Data":"be52049aec441c57abd7417673d3944592fde2e3e37b65f59ac057939ffd3792"} Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.511680 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-8554648995-hfwjh" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.511699 4765 scope.go:117] "RemoveContainer" containerID="38b0107116f2938a57ca101e12aab1daa10a976d6725747228d5472fb310b6e7" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.525678 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.525726 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5bf47b49b7-r4cpf" event={"ID":"61c50130-b1a8-4d94-99e1-02c523a426a7","Type":"ContainerDied","Data":"37a8778ed894117055c0f38c65d4f4b7ad33cd16d3dc0762754fe10094d79573"} Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.566201 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" podStartSLOduration=4.566173997 podStartE2EDuration="4.566173997s" podCreationTimestamp="2026-03-19 10:41:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:41:13.525500014 +0000 UTC m=+1171.874445566" watchObservedRunningTime="2026-03-19 10:41:13.566173997 +0000 UTC m=+1171.915119539" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.569880 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-northd-0" podStartSLOduration=2.970261312 podStartE2EDuration="4.569861507s" podCreationTimestamp="2026-03-19 10:41:09 +0000 UTC" firstStartedPulling="2026-03-19 10:41:10.6951617 +0000 UTC m=+1169.044107242" 
lastFinishedPulling="2026-03-19 10:41:12.294761895 +0000 UTC m=+1170.643707437" observedRunningTime="2026-03-19 10:41:13.556008892 +0000 UTC m=+1171.904954464" watchObservedRunningTime="2026-03-19 10:41:13.569861507 +0000 UTC m=+1171.918807049" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.580260 4765 scope.go:117] "RemoveContainer" containerID="c80b1a9fcddca73752ce670704933e787caf22053f86702b0ffdcdef30162e81" Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.620613 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-r4cpf"] Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.633319 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5bf47b49b7-r4cpf"] Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.690214 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-8554648995-hfwjh"] Mar 19 10:41:13 crc kubenswrapper[4765]: I0319 10:41:13.707434 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-8554648995-hfwjh"] Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.124680 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3f4d-account-create-update-qmrd4" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.146588 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-eb4b-account-create-update-jtnjd" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.193056 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cca3085-ce1d-43c5-ada0-89d57e6ce578-operator-scripts\") pod \"3cca3085-ce1d-43c5-ada0-89d57e6ce578\" (UID: \"3cca3085-ce1d-43c5-ada0-89d57e6ce578\") " Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.193241 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fxgw5\" (UniqueName: \"kubernetes.io/projected/3cca3085-ce1d-43c5-ada0-89d57e6ce578-kube-api-access-fxgw5\") pod \"3cca3085-ce1d-43c5-ada0-89d57e6ce578\" (UID: \"3cca3085-ce1d-43c5-ada0-89d57e6ce578\") " Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.194535 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cca3085-ce1d-43c5-ada0-89d57e6ce578-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "3cca3085-ce1d-43c5-ada0-89d57e6ce578" (UID: "3cca3085-ce1d-43c5-ada0-89d57e6ce578"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.196947 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p4px4\" (UniqueName: \"kubernetes.io/projected/df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc-kube-api-access-p4px4\") pod \"df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc\" (UID: \"df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc\") " Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.197138 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc-operator-scripts\") pod \"df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc\" (UID: \"df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc\") " Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.197709 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc" (UID: "df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.197803 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/3cca3085-ce1d-43c5-ada0-89d57e6ce578-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.203458 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cca3085-ce1d-43c5-ada0-89d57e6ce578-kube-api-access-fxgw5" (OuterVolumeSpecName: "kube-api-access-fxgw5") pod "3cca3085-ce1d-43c5-ada0-89d57e6ce578" (UID: "3cca3085-ce1d-43c5-ada0-89d57e6ce578"). InnerVolumeSpecName "kube-api-access-fxgw5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.214842 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc-kube-api-access-p4px4" (OuterVolumeSpecName: "kube-api-access-p4px4") pod "df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc" (UID: "df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc"). InnerVolumeSpecName "kube-api-access-p4px4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.301241 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fxgw5\" (UniqueName: \"kubernetes.io/projected/3cca3085-ce1d-43c5-ada0-89d57e6ce578-kube-api-access-fxgw5\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.301277 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p4px4\" (UniqueName: \"kubernetes.io/projected/df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc-kube-api-access-p4px4\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.301288 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.352688 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-8sh2p"] Mar 19 10:41:14 crc kubenswrapper[4765]: E0319 10:41:14.353053 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb59bf52-1cd4-4d26-9c5d-9ee4561267c5" containerName="mariadb-database-create" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353068 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb59bf52-1cd4-4d26-9c5d-9ee4561267c5" containerName="mariadb-database-create" Mar 19 10:41:14 crc kubenswrapper[4765]: E0319 10:41:14.353093 
4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c" containerName="mariadb-database-create" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353098 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c" containerName="mariadb-database-create" Mar 19 10:41:14 crc kubenswrapper[4765]: E0319 10:41:14.353116 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="777c1144-ff82-4ac0-a887-e5859bccf142" containerName="mariadb-account-create-update" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353124 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="777c1144-ff82-4ac0-a887-e5859bccf142" containerName="mariadb-account-create-update" Mar 19 10:41:14 crc kubenswrapper[4765]: E0319 10:41:14.353137 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7227feea-b1ef-4e43-a224-ef1b078fc070" containerName="init" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353142 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="7227feea-b1ef-4e43-a224-ef1b078fc070" containerName="init" Mar 19 10:41:14 crc kubenswrapper[4765]: E0319 10:41:14.353149 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc" containerName="mariadb-account-create-update" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353154 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc" containerName="mariadb-account-create-update" Mar 19 10:41:14 crc kubenswrapper[4765]: E0319 10:41:14.353167 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3cca3085-ce1d-43c5-ada0-89d57e6ce578" containerName="mariadb-account-create-update" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353173 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3cca3085-ce1d-43c5-ada0-89d57e6ce578" containerName="mariadb-account-create-update" 
Mar 19 10:41:14 crc kubenswrapper[4765]: E0319 10:41:14.353183 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61c50130-b1a8-4d94-99e1-02c523a426a7" containerName="init" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353189 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="61c50130-b1a8-4d94-99e1-02c523a426a7" containerName="init" Mar 19 10:41:14 crc kubenswrapper[4765]: E0319 10:41:14.353202 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="31509f20-2f87-4039-b2fc-7a65a11e34e8" containerName="mariadb-database-create" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353207 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="31509f20-2f87-4039-b2fc-7a65a11e34e8" containerName="mariadb-database-create" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353368 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="777c1144-ff82-4ac0-a887-e5859bccf142" containerName="mariadb-account-create-update" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353386 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c" containerName="mariadb-database-create" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353395 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="7227feea-b1ef-4e43-a224-ef1b078fc070" containerName="init" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353404 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc" containerName="mariadb-account-create-update" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353417 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="61c50130-b1a8-4d94-99e1-02c523a426a7" containerName="init" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353426 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="31509f20-2f87-4039-b2fc-7a65a11e34e8" 
containerName="mariadb-database-create" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353437 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb59bf52-1cd4-4d26-9c5d-9ee4561267c5" containerName="mariadb-database-create" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353449 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3cca3085-ce1d-43c5-ada0-89d57e6ce578" containerName="mariadb-account-create-update" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.353994 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-8sh2p" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.373330 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-mariadb-root-db-secret" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.374333 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61c50130-b1a8-4d94-99e1-02c523a426a7" path="/var/lib/kubelet/pods/61c50130-b1a8-4d94-99e1-02c523a426a7/volumes" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.374865 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7227feea-b1ef-4e43-a224-ef1b078fc070" path="/var/lib/kubelet/pods/7227feea-b1ef-4e43-a224-ef1b078fc070/volumes" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.379308 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8sh2p"] Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.402518 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2462b80-a766-4393-97c4-cc95e1d4b8b4-operator-scripts\") pod \"root-account-create-update-8sh2p\" (UID: \"f2462b80-a766-4393-97c4-cc95e1d4b8b4\") " pod="openstack/root-account-create-update-8sh2p" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.402677 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jdss7\" (UniqueName: \"kubernetes.io/projected/f2462b80-a766-4393-97c4-cc95e1d4b8b4-kube-api-access-jdss7\") pod \"root-account-create-update-8sh2p\" (UID: \"f2462b80-a766-4393-97c4-cc95e1d4b8b4\") " pod="openstack/root-account-create-update-8sh2p" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.504678 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2462b80-a766-4393-97c4-cc95e1d4b8b4-operator-scripts\") pod \"root-account-create-update-8sh2p\" (UID: \"f2462b80-a766-4393-97c4-cc95e1d4b8b4\") " pod="openstack/root-account-create-update-8sh2p" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.505558 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2462b80-a766-4393-97c4-cc95e1d4b8b4-operator-scripts\") pod \"root-account-create-update-8sh2p\" (UID: \"f2462b80-a766-4393-97c4-cc95e1d4b8b4\") " pod="openstack/root-account-create-update-8sh2p" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.505794 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jdss7\" (UniqueName: \"kubernetes.io/projected/f2462b80-a766-4393-97c4-cc95e1d4b8b4-kube-api-access-jdss7\") pod \"root-account-create-update-8sh2p\" (UID: \"f2462b80-a766-4393-97c4-cc95e1d4b8b4\") " pod="openstack/root-account-create-update-8sh2p" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.535468 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jdss7\" (UniqueName: \"kubernetes.io/projected/f2462b80-a766-4393-97c4-cc95e1d4b8b4-kube-api-access-jdss7\") pod \"root-account-create-update-8sh2p\" (UID: \"f2462b80-a766-4393-97c4-cc95e1d4b8b4\") " pod="openstack/root-account-create-update-8sh2p" Mar 19 10:41:14 crc 
kubenswrapper[4765]: I0319 10:41:14.545743 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-3f4d-account-create-update-qmrd4" event={"ID":"df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc","Type":"ContainerDied","Data":"39a6bd60b6f736f897e58d8b6ae9933780fa2312f79be1eb2ce9d8c82f0ac939"} Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.545791 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="39a6bd60b6f736f897e58d8b6ae9933780fa2312f79be1eb2ce9d8c82f0ac939" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.545861 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-3f4d-account-create-update-qmrd4" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.560424 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-eb4b-account-create-update-jtnjd" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.560845 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-eb4b-account-create-update-jtnjd" event={"ID":"3cca3085-ce1d-43c5-ada0-89d57e6ce578","Type":"ContainerDied","Data":"d44d1445161bde50237dd375a59d77f95e397ee38066a6ae540f17d548194a11"} Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.560873 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d44d1445161bde50237dd375a59d77f95e397ee38066a6ae540f17d548194a11" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.687287 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8sh2p" Mar 19 10:41:14 crc kubenswrapper[4765]: I0319 10:41:14.925255 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:14 crc kubenswrapper[4765]: E0319 10:41:14.925517 4765 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 10:41:14 crc kubenswrapper[4765]: E0319 10:41:14.925549 4765 projected.go:194] Error preparing data for projected volume etc-swift for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 10:41:14 crc kubenswrapper[4765]: E0319 10:41:14.925626 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift podName:21734dce-e034-473f-a919-7026f837ede2 nodeName:}" failed. No retries permitted until 2026-03-19 10:41:18.925602848 +0000 UTC m=+1177.274548400 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift") pod "swift-storage-0" (UID: "21734dce-e034-473f-a919-7026f837ede2") : configmap "swift-ring-files" not found Mar 19 10:41:17 crc kubenswrapper[4765]: I0319 10:41:17.526756 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-8sh2p"] Mar 19 10:41:17 crc kubenswrapper[4765]: W0319 10:41:17.539011 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf2462b80_a766_4393_97c4_cc95e1d4b8b4.slice/crio-7c9f84f72ecff70d1255165c3077d3e9cfeb58acfd1697f3da91b3bea82cbace WatchSource:0}: Error finding container 7c9f84f72ecff70d1255165c3077d3e9cfeb58acfd1697f3da91b3bea82cbace: Status 404 returned error can't find the container with id 7c9f84f72ecff70d1255165c3077d3e9cfeb58acfd1697f3da91b3bea82cbace Mar 19 10:41:17 crc kubenswrapper[4765]: I0319 10:41:17.584617 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8sh2p" event={"ID":"f2462b80-a766-4393-97c4-cc95e1d4b8b4","Type":"ContainerStarted","Data":"7c9f84f72ecff70d1255165c3077d3e9cfeb58acfd1697f3da91b3bea82cbace"} Mar 19 10:41:17 crc kubenswrapper[4765]: I0319 10:41:17.586166 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f5w89" event={"ID":"bd78fb4a-24b1-4fb7-8994-3668d29ff042","Type":"ContainerStarted","Data":"9a2c56ff8d06c219960ed249c4e5e26a0875149de794946a9943b5cc9a42c608"} Mar 19 10:41:17 crc kubenswrapper[4765]: I0319 10:41:17.604545 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-ring-rebalance-f5w89" podStartSLOduration=2.254813953 podStartE2EDuration="6.604526946s" podCreationTimestamp="2026-03-19 10:41:11 +0000 UTC" firstStartedPulling="2026-03-19 10:41:12.747043081 +0000 UTC m=+1171.095988623" 
lastFinishedPulling="2026-03-19 10:41:17.096756074 +0000 UTC m=+1175.445701616" observedRunningTime="2026-03-19 10:41:17.602547602 +0000 UTC m=+1175.951493144" watchObservedRunningTime="2026-03-19 10:41:17.604526946 +0000 UTC m=+1175.953472488" Mar 19 10:41:17 crc kubenswrapper[4765]: I0319 10:41:17.907817 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-db-sync-ds8t6"] Mar 19 10:41:17 crc kubenswrapper[4765]: I0319 10:41:17.909186 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ds8t6" Mar 19 10:41:17 crc kubenswrapper[4765]: I0319 10:41:17.911487 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-config-data" Mar 19 10:41:17 crc kubenswrapper[4765]: I0319 10:41:17.911804 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xrfsf" Mar 19 10:41:17 crc kubenswrapper[4765]: I0319 10:41:17.922212 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ds8t6"] Mar 19 10:41:17 crc kubenswrapper[4765]: I0319 10:41:17.992982 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-combined-ca-bundle\") pod \"glance-db-sync-ds8t6\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") " pod="openstack/glance-db-sync-ds8t6" Mar 19 10:41:17 crc kubenswrapper[4765]: I0319 10:41:17.993037 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-config-data\") pod \"glance-db-sync-ds8t6\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") " pod="openstack/glance-db-sync-ds8t6" Mar 19 10:41:17 crc kubenswrapper[4765]: I0319 10:41:17.993081 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-db-sync-config-data\") pod \"glance-db-sync-ds8t6\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") " pod="openstack/glance-db-sync-ds8t6" Mar 19 10:41:17 crc kubenswrapper[4765]: I0319 10:41:17.993219 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vg4kv\" (UniqueName: \"kubernetes.io/projected/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-kube-api-access-vg4kv\") pod \"glance-db-sync-ds8t6\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") " pod="openstack/glance-db-sync-ds8t6" Mar 19 10:41:18 crc kubenswrapper[4765]: I0319 10:41:18.094472 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-combined-ca-bundle\") pod \"glance-db-sync-ds8t6\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") " pod="openstack/glance-db-sync-ds8t6" Mar 19 10:41:18 crc kubenswrapper[4765]: I0319 10:41:18.094551 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-config-data\") pod \"glance-db-sync-ds8t6\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") " pod="openstack/glance-db-sync-ds8t6" Mar 19 10:41:18 crc kubenswrapper[4765]: I0319 10:41:18.094617 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-db-sync-config-data\") pod \"glance-db-sync-ds8t6\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") " pod="openstack/glance-db-sync-ds8t6" Mar 19 10:41:18 crc kubenswrapper[4765]: I0319 10:41:18.094643 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vg4kv\" (UniqueName: 
\"kubernetes.io/projected/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-kube-api-access-vg4kv\") pod \"glance-db-sync-ds8t6\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") " pod="openstack/glance-db-sync-ds8t6" Mar 19 10:41:18 crc kubenswrapper[4765]: I0319 10:41:18.102136 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-combined-ca-bundle\") pod \"glance-db-sync-ds8t6\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") " pod="openstack/glance-db-sync-ds8t6" Mar 19 10:41:18 crc kubenswrapper[4765]: I0319 10:41:18.104452 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-db-sync-config-data\") pod \"glance-db-sync-ds8t6\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") " pod="openstack/glance-db-sync-ds8t6" Mar 19 10:41:18 crc kubenswrapper[4765]: I0319 10:41:18.108578 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-config-data\") pod \"glance-db-sync-ds8t6\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") " pod="openstack/glance-db-sync-ds8t6" Mar 19 10:41:18 crc kubenswrapper[4765]: I0319 10:41:18.112609 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vg4kv\" (UniqueName: \"kubernetes.io/projected/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-kube-api-access-vg4kv\") pod \"glance-db-sync-ds8t6\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") " pod="openstack/glance-db-sync-ds8t6" Mar 19 10:41:18 crc kubenswrapper[4765]: I0319 10:41:18.227360 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-db-sync-ds8t6" Mar 19 10:41:18 crc kubenswrapper[4765]: I0319 10:41:18.596308 4765 generic.go:334] "Generic (PLEG): container finished" podID="f2462b80-a766-4393-97c4-cc95e1d4b8b4" containerID="919991393c196c5abb9575f97ee0463433c55869ae3b1a020861c0aab1a44f58" exitCode=0 Mar 19 10:41:18 crc kubenswrapper[4765]: I0319 10:41:18.596397 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8sh2p" event={"ID":"f2462b80-a766-4393-97c4-cc95e1d4b8b4","Type":"ContainerDied","Data":"919991393c196c5abb9575f97ee0463433c55869ae3b1a020861c0aab1a44f58"} Mar 19 10:41:18 crc kubenswrapper[4765]: I0319 10:41:18.871166 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-db-sync-ds8t6"] Mar 19 10:41:18 crc kubenswrapper[4765]: W0319 10:41:18.877983 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06ef9738_82a7_4b48_88c4_ef7fa8ee3cf0.slice/crio-075ce43fd7a2462768ad0c53fa07b989bd02bd483991f5dd0f01f9ac29344f4d WatchSource:0}: Error finding container 075ce43fd7a2462768ad0c53fa07b989bd02bd483991f5dd0f01f9ac29344f4d: Status 404 returned error can't find the container with id 075ce43fd7a2462768ad0c53fa07b989bd02bd483991f5dd0f01f9ac29344f4d Mar 19 10:41:18 crc kubenswrapper[4765]: I0319 10:41:18.936072 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:18 crc kubenswrapper[4765]: E0319 10:41:18.936274 4765 projected.go:288] Couldn't get configMap openstack/swift-ring-files: configmap "swift-ring-files" not found Mar 19 10:41:18 crc kubenswrapper[4765]: E0319 10:41:18.936310 4765 projected.go:194] Error preparing data for projected volume etc-swift 
for pod openstack/swift-storage-0: configmap "swift-ring-files" not found Mar 19 10:41:18 crc kubenswrapper[4765]: E0319 10:41:18.936384 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift podName:21734dce-e034-473f-a919-7026f837ede2 nodeName:}" failed. No retries permitted until 2026-03-19 10:41:26.936362898 +0000 UTC m=+1185.285308440 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift") pod "swift-storage-0" (UID: "21734dce-e034-473f-a919-7026f837ede2") : configmap "swift-ring-files" not found Mar 19 10:41:19 crc kubenswrapper[4765]: I0319 10:41:19.606242 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ds8t6" event={"ID":"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0","Type":"ContainerStarted","Data":"075ce43fd7a2462768ad0c53fa07b989bd02bd483991f5dd0f01f9ac29344f4d"} Mar 19 10:41:19 crc kubenswrapper[4765]: I0319 10:41:19.948000 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8sh2p" Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.056103 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2462b80-a766-4393-97c4-cc95e1d4b8b4-operator-scripts\") pod \"f2462b80-a766-4393-97c4-cc95e1d4b8b4\" (UID: \"f2462b80-a766-4393-97c4-cc95e1d4b8b4\") " Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.056389 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jdss7\" (UniqueName: \"kubernetes.io/projected/f2462b80-a766-4393-97c4-cc95e1d4b8b4-kube-api-access-jdss7\") pod \"f2462b80-a766-4393-97c4-cc95e1d4b8b4\" (UID: \"f2462b80-a766-4393-97c4-cc95e1d4b8b4\") " Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.058601 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f2462b80-a766-4393-97c4-cc95e1d4b8b4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "f2462b80-a766-4393-97c4-cc95e1d4b8b4" (UID: "f2462b80-a766-4393-97c4-cc95e1d4b8b4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.063092 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2462b80-a766-4393-97c4-cc95e1d4b8b4-kube-api-access-jdss7" (OuterVolumeSpecName: "kube-api-access-jdss7") pod "f2462b80-a766-4393-97c4-cc95e1d4b8b4" (UID: "f2462b80-a766-4393-97c4-cc95e1d4b8b4"). InnerVolumeSpecName "kube-api-access-jdss7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.116139 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.161936 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jdss7\" (UniqueName: \"kubernetes.io/projected/f2462b80-a766-4393-97c4-cc95e1d4b8b4-kube-api-access-jdss7\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.162334 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/f2462b80-a766-4393-97c4-cc95e1d4b8b4-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.183627 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nq26n"] Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.183861 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" podUID="cf28f1ab-e0a6-4481-bb82-4cd47321520a" containerName="dnsmasq-dns" containerID="cri-o://ac17fbc2f13baa25f279df75a69dd2bc1c669cc8be0926e3356cd36013cd3c4b" gracePeriod=10 Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.621763 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-8sh2p" Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.621762 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-8sh2p" event={"ID":"f2462b80-a766-4393-97c4-cc95e1d4b8b4","Type":"ContainerDied","Data":"7c9f84f72ecff70d1255165c3077d3e9cfeb58acfd1697f3da91b3bea82cbace"} Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.621993 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7c9f84f72ecff70d1255165c3077d3e9cfeb58acfd1697f3da91b3bea82cbace" Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.637561 4765 generic.go:334] "Generic (PLEG): container finished" podID="cf28f1ab-e0a6-4481-bb82-4cd47321520a" containerID="ac17fbc2f13baa25f279df75a69dd2bc1c669cc8be0926e3356cd36013cd3c4b" exitCode=0 Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.637613 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" event={"ID":"cf28f1ab-e0a6-4481-bb82-4cd47321520a","Type":"ContainerDied","Data":"ac17fbc2f13baa25f279df75a69dd2bc1c669cc8be0926e3356cd36013cd3c4b"} Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.756103 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.889499 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf28f1ab-e0a6-4481-bb82-4cd47321520a-dns-svc\") pod \"cf28f1ab-e0a6-4481-bb82-4cd47321520a\" (UID: \"cf28f1ab-e0a6-4481-bb82-4cd47321520a\") " Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.889596 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf28f1ab-e0a6-4481-bb82-4cd47321520a-config\") pod \"cf28f1ab-e0a6-4481-bb82-4cd47321520a\" (UID: \"cf28f1ab-e0a6-4481-bb82-4cd47321520a\") " Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.889688 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg8pr\" (UniqueName: \"kubernetes.io/projected/cf28f1ab-e0a6-4481-bb82-4cd47321520a-kube-api-access-qg8pr\") pod \"cf28f1ab-e0a6-4481-bb82-4cd47321520a\" (UID: \"cf28f1ab-e0a6-4481-bb82-4cd47321520a\") " Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.902310 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cf28f1ab-e0a6-4481-bb82-4cd47321520a-kube-api-access-qg8pr" (OuterVolumeSpecName: "kube-api-access-qg8pr") pod "cf28f1ab-e0a6-4481-bb82-4cd47321520a" (UID: "cf28f1ab-e0a6-4481-bb82-4cd47321520a"). InnerVolumeSpecName "kube-api-access-qg8pr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.933468 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf28f1ab-e0a6-4481-bb82-4cd47321520a-config" (OuterVolumeSpecName: "config") pod "cf28f1ab-e0a6-4481-bb82-4cd47321520a" (UID: "cf28f1ab-e0a6-4481-bb82-4cd47321520a"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.936348 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cf28f1ab-e0a6-4481-bb82-4cd47321520a-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cf28f1ab-e0a6-4481-bb82-4cd47321520a" (UID: "cf28f1ab-e0a6-4481-bb82-4cd47321520a"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.992006 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cf28f1ab-e0a6-4481-bb82-4cd47321520a-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.992039 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cf28f1ab-e0a6-4481-bb82-4cd47321520a-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:20 crc kubenswrapper[4765]: I0319 10:41:20.992049 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg8pr\" (UniqueName: \"kubernetes.io/projected/cf28f1ab-e0a6-4481-bb82-4cd47321520a-kube-api-access-qg8pr\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:21 crc kubenswrapper[4765]: I0319 10:41:21.649717 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" event={"ID":"cf28f1ab-e0a6-4481-bb82-4cd47321520a","Type":"ContainerDied","Data":"aa249d54c0f68ff5a300b8504492d8418d90cd1cab85af2463da9e71578e9e1e"} Mar 19 10:41:21 crc kubenswrapper[4765]: I0319 10:41:21.649771 4765 scope.go:117] "RemoveContainer" containerID="ac17fbc2f13baa25f279df75a69dd2bc1c669cc8be0926e3356cd36013cd3c4b" Mar 19 10:41:21 crc kubenswrapper[4765]: I0319 10:41:21.649894 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5ccc8479f9-nq26n" Mar 19 10:41:21 crc kubenswrapper[4765]: I0319 10:41:21.686946 4765 scope.go:117] "RemoveContainer" containerID="78d94cb29f97abba53b5187bbe469bc78608c941973177a633b4f7b638b353c8" Mar 19 10:41:21 crc kubenswrapper[4765]: I0319 10:41:21.691557 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nq26n"] Mar 19 10:41:21 crc kubenswrapper[4765]: I0319 10:41:21.697628 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5ccc8479f9-nq26n"] Mar 19 10:41:22 crc kubenswrapper[4765]: I0319 10:41:22.376394 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cf28f1ab-e0a6-4481-bb82-4cd47321520a" path="/var/lib/kubelet/pods/cf28f1ab-e0a6-4481-bb82-4cd47321520a/volumes" Mar 19 10:41:25 crc kubenswrapper[4765]: I0319 10:41:25.765384 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-8sh2p"] Mar 19 10:41:25 crc kubenswrapper[4765]: I0319 10:41:25.771886 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-8sh2p"] Mar 19 10:41:26 crc kubenswrapper[4765]: I0319 10:41:26.378766 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2462b80-a766-4393-97c4-cc95e1d4b8b4" path="/var/lib/kubelet/pods/f2462b80-a766-4393-97c4-cc95e1d4b8b4/volumes" Mar 19 10:41:26 crc kubenswrapper[4765]: I0319 10:41:26.700185 4765 generic.go:334] "Generic (PLEG): container finished" podID="bd78fb4a-24b1-4fb7-8994-3668d29ff042" containerID="9a2c56ff8d06c219960ed249c4e5e26a0875149de794946a9943b5cc9a42c608" exitCode=0 Mar 19 10:41:26 crc kubenswrapper[4765]: I0319 10:41:26.700250 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f5w89" event={"ID":"bd78fb4a-24b1-4fb7-8994-3668d29ff042","Type":"ContainerDied","Data":"9a2c56ff8d06c219960ed249c4e5e26a0875149de794946a9943b5cc9a42c608"} Mar 19 10:41:27 crc 
kubenswrapper[4765]: I0319 10:41:27.020993 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:27 crc kubenswrapper[4765]: I0319 10:41:27.047662 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/21734dce-e034-473f-a919-7026f837ede2-etc-swift\") pod \"swift-storage-0\" (UID: \"21734dce-e034-473f-a919-7026f837ede2\") " pod="openstack/swift-storage-0" Mar 19 10:41:27 crc kubenswrapper[4765]: I0319 10:41:27.103626 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-storage-0" Mar 19 10:41:28 crc kubenswrapper[4765]: I0319 10:41:28.725697 4765 generic.go:334] "Generic (PLEG): container finished" podID="ccdb0a31-8b87-4024-848f-efebcf46e604" containerID="ede7c513464d77c5408931eb19bc0afaf30f0f27c91691fc636041f35a22d68b" exitCode=0 Mar 19 10:41:28 crc kubenswrapper[4765]: I0319 10:41:28.725795 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ccdb0a31-8b87-4024-848f-efebcf46e604","Type":"ContainerDied","Data":"ede7c513464d77c5408931eb19bc0afaf30f0f27c91691fc636041f35a22d68b"} Mar 19 10:41:29 crc kubenswrapper[4765]: I0319 10:41:29.761677 4765 generic.go:334] "Generic (PLEG): container finished" podID="2190a046-0d52-49c7-b2fd-aa113c2f3f99" containerID="211030e1e9b78c18cebc64fcaced7fde324341ce901ce54b5a805cbbc4f3db12" exitCode=0 Mar 19 10:41:29 crc kubenswrapper[4765]: I0319 10:41:29.761748 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2190a046-0d52-49c7-b2fd-aa113c2f3f99","Type":"ContainerDied","Data":"211030e1e9b78c18cebc64fcaced7fde324341ce901ce54b5a805cbbc4f3db12"} Mar 19 10:41:29 crc 
kubenswrapper[4765]: I0319 10:41:29.954982 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-northd-0" Mar 19 10:41:30 crc kubenswrapper[4765]: I0319 10:41:30.772444 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/root-account-create-update-zqlg2"] Mar 19 10:41:30 crc kubenswrapper[4765]: E0319 10:41:30.773076 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f2462b80-a766-4393-97c4-cc95e1d4b8b4" containerName="mariadb-account-create-update" Mar 19 10:41:30 crc kubenswrapper[4765]: I0319 10:41:30.773089 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2462b80-a766-4393-97c4-cc95e1d4b8b4" containerName="mariadb-account-create-update" Mar 19 10:41:30 crc kubenswrapper[4765]: E0319 10:41:30.773110 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf28f1ab-e0a6-4481-bb82-4cd47321520a" containerName="dnsmasq-dns" Mar 19 10:41:30 crc kubenswrapper[4765]: I0319 10:41:30.773116 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf28f1ab-e0a6-4481-bb82-4cd47321520a" containerName="dnsmasq-dns" Mar 19 10:41:30 crc kubenswrapper[4765]: E0319 10:41:30.773133 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cf28f1ab-e0a6-4481-bb82-4cd47321520a" containerName="init" Mar 19 10:41:30 crc kubenswrapper[4765]: I0319 10:41:30.773139 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="cf28f1ab-e0a6-4481-bb82-4cd47321520a" containerName="init" Mar 19 10:41:30 crc kubenswrapper[4765]: I0319 10:41:30.773328 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f2462b80-a766-4393-97c4-cc95e1d4b8b4" containerName="mariadb-account-create-update" Mar 19 10:41:30 crc kubenswrapper[4765]: I0319 10:41:30.773360 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="cf28f1ab-e0a6-4481-bb82-4cd47321520a" containerName="dnsmasq-dns" Mar 19 10:41:30 crc kubenswrapper[4765]: I0319 10:41:30.774180 4765 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zqlg2" Mar 19 10:41:30 crc kubenswrapper[4765]: I0319 10:41:30.781278 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-cell1-mariadb-root-db-secret" Mar 19 10:41:30 crc kubenswrapper[4765]: I0319 10:41:30.784472 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zqlg2"] Mar 19 10:41:30 crc kubenswrapper[4765]: I0319 10:41:30.931401 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4e7a7f0-2aaf-4929-89aa-c96424bfca68-operator-scripts\") pod \"root-account-create-update-zqlg2\" (UID: \"c4e7a7f0-2aaf-4929-89aa-c96424bfca68\") " pod="openstack/root-account-create-update-zqlg2" Mar 19 10:41:30 crc kubenswrapper[4765]: I0319 10:41:30.931490 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hc9k4\" (UniqueName: \"kubernetes.io/projected/c4e7a7f0-2aaf-4929-89aa-c96424bfca68-kube-api-access-hc9k4\") pod \"root-account-create-update-zqlg2\" (UID: \"c4e7a7f0-2aaf-4929-89aa-c96424bfca68\") " pod="openstack/root-account-create-update-zqlg2" Mar 19 10:41:31 crc kubenswrapper[4765]: I0319 10:41:31.033782 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4e7a7f0-2aaf-4929-89aa-c96424bfca68-operator-scripts\") pod \"root-account-create-update-zqlg2\" (UID: \"c4e7a7f0-2aaf-4929-89aa-c96424bfca68\") " pod="openstack/root-account-create-update-zqlg2" Mar 19 10:41:31 crc kubenswrapper[4765]: I0319 10:41:31.033882 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hc9k4\" (UniqueName: \"kubernetes.io/projected/c4e7a7f0-2aaf-4929-89aa-c96424bfca68-kube-api-access-hc9k4\") pod 
\"root-account-create-update-zqlg2\" (UID: \"c4e7a7f0-2aaf-4929-89aa-c96424bfca68\") " pod="openstack/root-account-create-update-zqlg2" Mar 19 10:41:31 crc kubenswrapper[4765]: I0319 10:41:31.034616 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4e7a7f0-2aaf-4929-89aa-c96424bfca68-operator-scripts\") pod \"root-account-create-update-zqlg2\" (UID: \"c4e7a7f0-2aaf-4929-89aa-c96424bfca68\") " pod="openstack/root-account-create-update-zqlg2" Mar 19 10:41:31 crc kubenswrapper[4765]: I0319 10:41:31.062042 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hc9k4\" (UniqueName: \"kubernetes.io/projected/c4e7a7f0-2aaf-4929-89aa-c96424bfca68-kube-api-access-hc9k4\") pod \"root-account-create-update-zqlg2\" (UID: \"c4e7a7f0-2aaf-4929-89aa-c96424bfca68\") " pod="openstack/root-account-create-update-zqlg2" Mar 19 10:41:31 crc kubenswrapper[4765]: I0319 10:41:31.137861 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/root-account-create-update-zqlg2" Mar 19 10:41:32 crc kubenswrapper[4765]: I0319 10:41:32.819064 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ovn-controller-ct9xj" podUID="5272132e-561c-46b9-92c8-1714e40b3303" containerName="ovn-controller" probeResult="failure" output=< Mar 19 10:41:32 crc kubenswrapper[4765]: ERROR - ovn-controller connection status is 'not connected', expecting 'connected' status Mar 19 10:41:32 crc kubenswrapper[4765]: > Mar 19 10:41:32 crc kubenswrapper[4765]: I0319 10:41:32.864069 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:41:32 crc kubenswrapper[4765]: I0319 10:41:32.879298 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ovs-bmbgn" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.106818 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ct9xj-config-rplgt"] Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.108279 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.112379 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.116866 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ct9xj-config-rplgt"] Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.278184 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6958393-8685-4df8-a7fb-bdd6a695a409-scripts\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.278354 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-log-ovn\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.278401 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qskjz\" (UniqueName: \"kubernetes.io/projected/b6958393-8685-4df8-a7fb-bdd6a695a409-kube-api-access-qskjz\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.278444 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-run-ovn\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: 
\"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.278507 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-run\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.278540 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6958393-8685-4df8-a7fb-bdd6a695a409-additional-scripts\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.380940 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qskjz\" (UniqueName: \"kubernetes.io/projected/b6958393-8685-4df8-a7fb-bdd6a695a409-kube-api-access-qskjz\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.381028 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-run-ovn\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.381089 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-run\") pod 
\"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.381119 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6958393-8685-4df8-a7fb-bdd6a695a409-additional-scripts\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.381182 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6958393-8685-4df8-a7fb-bdd6a695a409-scripts\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.381263 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-log-ovn\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.381564 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-log-ovn\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.381564 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-run\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: 
\"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.381607 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-run-ovn\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.382431 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6958393-8685-4df8-a7fb-bdd6a695a409-additional-scripts\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.383763 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6958393-8685-4df8-a7fb-bdd6a695a409-scripts\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.399297 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qskjz\" (UniqueName: \"kubernetes.io/projected/b6958393-8685-4df8-a7fb-bdd6a695a409-kube-api-access-qskjz\") pod \"ovn-controller-ct9xj-config-rplgt\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.478074 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.816942 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-ring-rebalance-f5w89" event={"ID":"bd78fb4a-24b1-4fb7-8994-3668d29ff042","Type":"ContainerDied","Data":"2e5fb79704b20250dff8d5f96c45f80c308a9f5b8c4813e8df1ef59631cb24fb"} Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.817013 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2e5fb79704b20250dff8d5f96c45f80c308a9f5b8c4813e8df1ef59631cb24fb" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.848821 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.992466 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bd78fb4a-24b1-4fb7-8994-3668d29ff042-ring-data-devices\") pod \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.992534 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b8q74\" (UniqueName: \"kubernetes.io/projected/bd78fb4a-24b1-4fb7-8994-3668d29ff042-kube-api-access-b8q74\") pod \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.992622 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-swiftconf\") pod \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.992818 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-dispersionconf\") pod \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.992851 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd78fb4a-24b1-4fb7-8994-3668d29ff042-scripts\") pod \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.992916 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bd78fb4a-24b1-4fb7-8994-3668d29ff042-etc-swift\") pod \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.992937 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-combined-ca-bundle\") pod \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\" (UID: \"bd78fb4a-24b1-4fb7-8994-3668d29ff042\") " Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.994791 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bd78fb4a-24b1-4fb7-8994-3668d29ff042-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bd78fb4a-24b1-4fb7-8994-3668d29ff042" (UID: "bd78fb4a-24b1-4fb7-8994-3668d29ff042"). InnerVolumeSpecName "etc-swift". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.994853 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd78fb4a-24b1-4fb7-8994-3668d29ff042-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "bd78fb4a-24b1-4fb7-8994-3668d29ff042" (UID: "bd78fb4a-24b1-4fb7-8994-3668d29ff042"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.995353 4765 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/bd78fb4a-24b1-4fb7-8994-3668d29ff042-ring-data-devices\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:33 crc kubenswrapper[4765]: I0319 10:41:33.995376 4765 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/bd78fb4a-24b1-4fb7-8994-3668d29ff042-etc-swift\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.001399 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd78fb4a-24b1-4fb7-8994-3668d29ff042-kube-api-access-b8q74" (OuterVolumeSpecName: "kube-api-access-b8q74") pod "bd78fb4a-24b1-4fb7-8994-3668d29ff042" (UID: "bd78fb4a-24b1-4fb7-8994-3668d29ff042"). InnerVolumeSpecName "kube-api-access-b8q74". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.007229 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "bd78fb4a-24b1-4fb7-8994-3668d29ff042" (UID: "bd78fb4a-24b1-4fb7-8994-3668d29ff042"). InnerVolumeSpecName "dispersionconf". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.037490 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ct9xj-config-rplgt"] Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.054189 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bd78fb4a-24b1-4fb7-8994-3668d29ff042" (UID: "bd78fb4a-24b1-4fb7-8994-3668d29ff042"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.054230 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "bd78fb4a-24b1-4fb7-8994-3668d29ff042" (UID: "bd78fb4a-24b1-4fb7-8994-3668d29ff042"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.057861 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bd78fb4a-24b1-4fb7-8994-3668d29ff042-scripts" (OuterVolumeSpecName: "scripts") pod "bd78fb4a-24b1-4fb7-8994-3668d29ff042" (UID: "bd78fb4a-24b1-4fb7-8994-3668d29ff042"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.097591 4765 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-dispersionconf\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.097630 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/bd78fb4a-24b1-4fb7-8994-3668d29ff042-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.097641 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.097651 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b8q74\" (UniqueName: \"kubernetes.io/projected/bd78fb4a-24b1-4fb7-8994-3668d29ff042-kube-api-access-b8q74\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.097663 4765 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/bd78fb4a-24b1-4fb7-8994-3668d29ff042-swiftconf\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:34 crc kubenswrapper[4765]: W0319 10:41:34.155077 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb6958393_8685_4df8_a7fb_bdd6a695a409.slice/crio-b90f3190d425d2a308e4509359361f709d919f3bd2851336d346b425615204e2 WatchSource:0}: Error finding container b90f3190d425d2a308e4509359361f709d919f3bd2851336d346b425615204e2: Status 404 returned error can't find the container with id b90f3190d425d2a308e4509359361f709d919f3bd2851336d346b425615204e2 Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.375520 4765 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/root-account-create-update-zqlg2"] Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.570740 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-storage-0"] Mar 19 10:41:34 crc kubenswrapper[4765]: W0319 10:41:34.587459 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod21734dce_e034_473f_a919_7026f837ede2.slice/crio-392cac5260ad8b65b6ae32ccb83ae6d9c1794ddc7f9329092e4a56bd66e337bb WatchSource:0}: Error finding container 392cac5260ad8b65b6ae32ccb83ae6d9c1794ddc7f9329092e4a56bd66e337bb: Status 404 returned error can't find the container with id 392cac5260ad8b65b6ae32ccb83ae6d9c1794ddc7f9329092e4a56bd66e337bb Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.829063 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2190a046-0d52-49c7-b2fd-aa113c2f3f99","Type":"ContainerStarted","Data":"da10f80dff1f73e3d83ca18e932736d7d582457c3a6c02937b0e90ea18e0cec9"} Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.829267 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.830338 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ct9xj-config-rplgt" event={"ID":"b6958393-8685-4df8-a7fb-bdd6a695a409","Type":"ContainerStarted","Data":"b90f3190d425d2a308e4509359361f709d919f3bd2851336d346b425615204e2"} Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.832202 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ccdb0a31-8b87-4024-848f-efebcf46e604","Type":"ContainerStarted","Data":"ac4ca5b206e9277cd39173f5dfd422276689843a7fc0ce7c0e0c3a65f43ef637"} Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.832386 4765 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.833445 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zqlg2" event={"ID":"c4e7a7f0-2aaf-4929-89aa-c96424bfca68","Type":"ContainerStarted","Data":"b595cf0949d673f76838737b4513727d5468c642b8563d84bce559d2289f2264"} Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.834476 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/swift-ring-rebalance-f5w89" Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.834468 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"392cac5260ad8b65b6ae32ccb83ae6d9c1794ddc7f9329092e4a56bd66e337bb"} Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.860348 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=48.285794349 podStartE2EDuration="1m1.860332018s" podCreationTimestamp="2026-03-19 10:40:33 +0000 UTC" firstStartedPulling="2026-03-19 10:40:41.704341431 +0000 UTC m=+1140.053286973" lastFinishedPulling="2026-03-19 10:40:55.2788791 +0000 UTC m=+1153.627824642" observedRunningTime="2026-03-19 10:41:34.858158119 +0000 UTC m=+1193.207103671" watchObservedRunningTime="2026-03-19 10:41:34.860332018 +0000 UTC m=+1193.209277560" Mar 19 10:41:34 crc kubenswrapper[4765]: I0319 10:41:34.901453 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=45.878909208 podStartE2EDuration="1m2.901431613s" podCreationTimestamp="2026-03-19 10:40:32 +0000 UTC" firstStartedPulling="2026-03-19 10:40:38.154938934 +0000 UTC m=+1136.503884476" lastFinishedPulling="2026-03-19 10:40:55.177461339 +0000 UTC m=+1153.526406881" observedRunningTime="2026-03-19 
10:41:34.897832205 +0000 UTC m=+1193.246777747" watchObservedRunningTime="2026-03-19 10:41:34.901431613 +0000 UTC m=+1193.250377155" Mar 19 10:41:35 crc kubenswrapper[4765]: I0319 10:41:35.846675 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ct9xj-config-rplgt" event={"ID":"b6958393-8685-4df8-a7fb-bdd6a695a409","Type":"ContainerStarted","Data":"34a946f35cc460e5df13d903d4f2c1bd42982ba98394d7334cb6c9b9790ff5e2"} Mar 19 10:41:36 crc kubenswrapper[4765]: I0319 10:41:36.858785 4765 generic.go:334] "Generic (PLEG): container finished" podID="b6958393-8685-4df8-a7fb-bdd6a695a409" containerID="34a946f35cc460e5df13d903d4f2c1bd42982ba98394d7334cb6c9b9790ff5e2" exitCode=0 Mar 19 10:41:36 crc kubenswrapper[4765]: I0319 10:41:36.858944 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ct9xj-config-rplgt" event={"ID":"b6958393-8685-4df8-a7fb-bdd6a695a409","Type":"ContainerDied","Data":"34a946f35cc460e5df13d903d4f2c1bd42982ba98394d7334cb6c9b9790ff5e2"} Mar 19 10:41:36 crc kubenswrapper[4765]: I0319 10:41:36.861893 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zqlg2" event={"ID":"c4e7a7f0-2aaf-4929-89aa-c96424bfca68","Type":"ContainerStarted","Data":"1cc3d8725f02e33192f2fee9bfc932918d270bd09ff573e83da6f36acd68e948"} Mar 19 10:41:37 crc kubenswrapper[4765]: I0319 10:41:37.812195 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ovn-controller-ct9xj" Mar 19 10:41:44 crc kubenswrapper[4765]: I0319 10:41:44.344183 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-cell1-server-0" podUID="ccdb0a31-8b87-4024-848f-efebcf46e604" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.101:5671: connect: connection refused" Mar 19 10:41:44 crc kubenswrapper[4765]: I0319 10:41:44.678482 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/rabbitmq-server-0" 
podUID="2190a046-0d52-49c7-b2fd-aa113c2f3f99" containerName="rabbitmq" probeResult="failure" output="dial tcp 10.217.0.102:5671: connect: connection refused" Mar 19 10:41:44 crc kubenswrapper[4765]: I0319 10:41:44.925447 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ds8t6" event={"ID":"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0","Type":"ContainerStarted","Data":"ee29eba42d553c93953418fe06ecbf9d6ed4b1cd71a398e57f4359ab6ab93961"} Mar 19 10:41:44 crc kubenswrapper[4765]: I0319 10:41:44.927781 4765 generic.go:334] "Generic (PLEG): container finished" podID="c4e7a7f0-2aaf-4929-89aa-c96424bfca68" containerID="1cc3d8725f02e33192f2fee9bfc932918d270bd09ff573e83da6f36acd68e948" exitCode=0 Mar 19 10:41:44 crc kubenswrapper[4765]: I0319 10:41:44.927940 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zqlg2" event={"ID":"c4e7a7f0-2aaf-4929-89aa-c96424bfca68","Type":"ContainerDied","Data":"1cc3d8725f02e33192f2fee9bfc932918d270bd09ff573e83da6f36acd68e948"} Mar 19 10:41:45 crc kubenswrapper[4765]: I0319 10:41:45.952827 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"082b7a110bce10cd11dba73396567b33c0376edd3eaa313d022003bf8eb4df9e"} Mar 19 10:41:45 crc kubenswrapper[4765]: I0319 10:41:45.953492 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"8da6b4a15a064b37ff277eebff27b535a264a2c2ebbe18f79e93d393a4f6ffb6"} Mar 19 10:41:45 crc kubenswrapper[4765]: I0319 10:41:45.953504 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"e5b282bfb4980acd33930ed4d33f5d22358b332bbf24f1f377d06c882af210cf"} Mar 19 10:41:45 crc kubenswrapper[4765]: I0319 
10:41:45.953512 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"a743188467c0d6705b70af02f386d07d2ec814ea7bae3a8156dfafb9f5326400"} Mar 19 10:41:45 crc kubenswrapper[4765]: I0319 10:41:45.977317 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-db-sync-ds8t6" podStartSLOduration=13.442682283 podStartE2EDuration="28.977292633s" podCreationTimestamp="2026-03-19 10:41:17 +0000 UTC" firstStartedPulling="2026-03-19 10:41:18.891942533 +0000 UTC m=+1177.240888075" lastFinishedPulling="2026-03-19 10:41:34.426552893 +0000 UTC m=+1192.775498425" observedRunningTime="2026-03-19 10:41:45.975371491 +0000 UTC m=+1204.324317033" watchObservedRunningTime="2026-03-19 10:41:45.977292633 +0000 UTC m=+1204.326238175" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.351152 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zqlg2" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.356373 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.426497 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hc9k4\" (UniqueName: \"kubernetes.io/projected/c4e7a7f0-2aaf-4929-89aa-c96424bfca68-kube-api-access-hc9k4\") pod \"c4e7a7f0-2aaf-4929-89aa-c96424bfca68\" (UID: \"c4e7a7f0-2aaf-4929-89aa-c96424bfca68\") " Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.426559 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-run-ovn\") pod \"b6958393-8685-4df8-a7fb-bdd6a695a409\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.426627 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qskjz\" (UniqueName: \"kubernetes.io/projected/b6958393-8685-4df8-a7fb-bdd6a695a409-kube-api-access-qskjz\") pod \"b6958393-8685-4df8-a7fb-bdd6a695a409\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.426677 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-run\") pod \"b6958393-8685-4df8-a7fb-bdd6a695a409\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.426766 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "b6958393-8685-4df8-a7fb-bdd6a695a409" (UID: "b6958393-8685-4df8-a7fb-bdd6a695a409"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.426790 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4e7a7f0-2aaf-4929-89aa-c96424bfca68-operator-scripts\") pod \"c4e7a7f0-2aaf-4929-89aa-c96424bfca68\" (UID: \"c4e7a7f0-2aaf-4929-89aa-c96424bfca68\") " Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.426986 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6958393-8685-4df8-a7fb-bdd6a695a409-additional-scripts\") pod \"b6958393-8685-4df8-a7fb-bdd6a695a409\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.427069 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-log-ovn\") pod \"b6958393-8685-4df8-a7fb-bdd6a695a409\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.427120 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6958393-8685-4df8-a7fb-bdd6a695a409-scripts\") pod \"b6958393-8685-4df8-a7fb-bdd6a695a409\" (UID: \"b6958393-8685-4df8-a7fb-bdd6a695a409\") " Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.427186 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-run" (OuterVolumeSpecName: "var-run") pod "b6958393-8685-4df8-a7fb-bdd6a695a409" (UID: "b6958393-8685-4df8-a7fb-bdd6a695a409"). InnerVolumeSpecName "var-run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.427276 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "b6958393-8685-4df8-a7fb-bdd6a695a409" (UID: "b6958393-8685-4df8-a7fb-bdd6a695a409"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.427775 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6958393-8685-4df8-a7fb-bdd6a695a409-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "b6958393-8685-4df8-a7fb-bdd6a695a409" (UID: "b6958393-8685-4df8-a7fb-bdd6a695a409"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.428000 4765 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-log-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.428028 4765 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.428042 4765 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/b6958393-8685-4df8-a7fb-bdd6a695a409-var-run\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.428057 4765 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/b6958393-8685-4df8-a7fb-bdd6a695a409-additional-scripts\") on node \"crc\" DevicePath \"\"" 
Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.428066 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c4e7a7f0-2aaf-4929-89aa-c96424bfca68-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "c4e7a7f0-2aaf-4929-89aa-c96424bfca68" (UID: "c4e7a7f0-2aaf-4929-89aa-c96424bfca68"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.428218 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6958393-8685-4df8-a7fb-bdd6a695a409-scripts" (OuterVolumeSpecName: "scripts") pod "b6958393-8685-4df8-a7fb-bdd6a695a409" (UID: "b6958393-8685-4df8-a7fb-bdd6a695a409"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.439410 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c4e7a7f0-2aaf-4929-89aa-c96424bfca68-kube-api-access-hc9k4" (OuterVolumeSpecName: "kube-api-access-hc9k4") pod "c4e7a7f0-2aaf-4929-89aa-c96424bfca68" (UID: "c4e7a7f0-2aaf-4929-89aa-c96424bfca68"). InnerVolumeSpecName "kube-api-access-hc9k4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.445100 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6958393-8685-4df8-a7fb-bdd6a695a409-kube-api-access-qskjz" (OuterVolumeSpecName: "kube-api-access-qskjz") pod "b6958393-8685-4df8-a7fb-bdd6a695a409" (UID: "b6958393-8685-4df8-a7fb-bdd6a695a409"). InnerVolumeSpecName "kube-api-access-qskjz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.529693 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b6958393-8685-4df8-a7fb-bdd6a695a409-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.529735 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hc9k4\" (UniqueName: \"kubernetes.io/projected/c4e7a7f0-2aaf-4929-89aa-c96424bfca68-kube-api-access-hc9k4\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.529749 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qskjz\" (UniqueName: \"kubernetes.io/projected/b6958393-8685-4df8-a7fb-bdd6a695a409-kube-api-access-qskjz\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.529765 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/c4e7a7f0-2aaf-4929-89aa-c96424bfca68-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.964215 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ct9xj-config-rplgt" event={"ID":"b6958393-8685-4df8-a7fb-bdd6a695a409","Type":"ContainerDied","Data":"b90f3190d425d2a308e4509359361f709d919f3bd2851336d346b425615204e2"} Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.964276 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b90f3190d425d2a308e4509359361f709d919f3bd2851336d346b425615204e2" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.964373 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ct9xj-config-rplgt" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.971003 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/root-account-create-update-zqlg2" event={"ID":"c4e7a7f0-2aaf-4929-89aa-c96424bfca68","Type":"ContainerDied","Data":"b595cf0949d673f76838737b4513727d5468c642b8563d84bce559d2289f2264"} Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.971038 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b595cf0949d673f76838737b4513727d5468c642b8563d84bce559d2289f2264" Mar 19 10:41:46 crc kubenswrapper[4765]: I0319 10:41:46.971105 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/root-account-create-update-zqlg2" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.492071 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ct9xj-config-rplgt"] Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.501883 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ct9xj-config-rplgt"] Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.639818 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-controller-ct9xj-config-q8849"] Mar 19 10:41:47 crc kubenswrapper[4765]: E0319 10:41:47.640271 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6958393-8685-4df8-a7fb-bdd6a695a409" containerName="ovn-config" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.640296 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6958393-8685-4df8-a7fb-bdd6a695a409" containerName="ovn-config" Mar 19 10:41:47 crc kubenswrapper[4765]: E0319 10:41:47.640318 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c4e7a7f0-2aaf-4929-89aa-c96424bfca68" containerName="mariadb-account-create-update" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.640328 4765 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="c4e7a7f0-2aaf-4929-89aa-c96424bfca68" containerName="mariadb-account-create-update" Mar 19 10:41:47 crc kubenswrapper[4765]: E0319 10:41:47.640353 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd78fb4a-24b1-4fb7-8994-3668d29ff042" containerName="swift-ring-rebalance" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.640361 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd78fb4a-24b1-4fb7-8994-3668d29ff042" containerName="swift-ring-rebalance" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.640580 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6958393-8685-4df8-a7fb-bdd6a695a409" containerName="ovn-config" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.640604 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c4e7a7f0-2aaf-4929-89aa-c96424bfca68" containerName="mariadb-account-create-update" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.640618 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd78fb4a-24b1-4fb7-8994-3668d29ff042" containerName="swift-ring-rebalance" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.642377 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.657840 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-extra-scripts" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.668281 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ct9xj-config-q8849"] Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.761609 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xppdb\" (UniqueName: \"kubernetes.io/projected/f9f17842-7a45-43e9-bb84-03ffa0a65f86-kube-api-access-xppdb\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.761685 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-run-ovn\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.762069 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9f17842-7a45-43e9-bb84-03ffa0a65f86-scripts\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.762103 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-log-ovn\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: 
\"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.762155 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-run\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.762179 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f9f17842-7a45-43e9-bb84-03ffa0a65f86-additional-scripts\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.867871 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9f17842-7a45-43e9-bb84-03ffa0a65f86-scripts\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.868242 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-log-ovn\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.868283 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-run\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: 
\"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.868299 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f9f17842-7a45-43e9-bb84-03ffa0a65f86-additional-scripts\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.868330 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xppdb\" (UniqueName: \"kubernetes.io/projected/f9f17842-7a45-43e9-bb84-03ffa0a65f86-kube-api-access-xppdb\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.868357 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-run-ovn\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.868754 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-run-ovn\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.868810 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-log-ovn\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: 
\"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.868846 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-run\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.869462 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f9f17842-7a45-43e9-bb84-03ffa0a65f86-additional-scripts\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.871162 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9f17842-7a45-43e9-bb84-03ffa0a65f86-scripts\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.900861 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xppdb\" (UniqueName: \"kubernetes.io/projected/f9f17842-7a45-43e9-bb84-03ffa0a65f86-kube-api-access-xppdb\") pod \"ovn-controller-ct9xj-config-q8849\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.989399 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.995542 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"9f616ab79ebf90f19b8500d85635868fea46da035a9f1b92ce665345e18a41c9"} Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.995598 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"2fa3550813a5b844f90e090122122c8638fc43365b3e3320a79c471c2517a06d"} Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.995609 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"2efca533b101df5f8bab21a50f36436d9a21488f008d9331c3973302df1a5484"} Mar 19 10:41:47 crc kubenswrapper[4765]: I0319 10:41:47.995621 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"c776b0a215c53293aea6abcc02b3256d08715644810b39327802d8f21db06093"} Mar 19 10:41:48 crc kubenswrapper[4765]: I0319 10:41:48.322235 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-controller-ct9xj-config-q8849"] Mar 19 10:41:48 crc kubenswrapper[4765]: I0319 10:41:48.369106 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6958393-8685-4df8-a7fb-bdd6a695a409" path="/var/lib/kubelet/pods/b6958393-8685-4df8-a7fb-bdd6a695a409/volumes" Mar 19 10:41:49 crc kubenswrapper[4765]: I0319 10:41:49.008027 4765 generic.go:334] "Generic (PLEG): container finished" podID="f9f17842-7a45-43e9-bb84-03ffa0a65f86" containerID="405a6fd882b389058788691b6724f0ceecc9e069f7f80c7163dd4f20685d4d51" exitCode=0 Mar 19 10:41:49 crc kubenswrapper[4765]: 
I0319 10:41:49.008336 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ct9xj-config-q8849" event={"ID":"f9f17842-7a45-43e9-bb84-03ffa0a65f86","Type":"ContainerDied","Data":"405a6fd882b389058788691b6724f0ceecc9e069f7f80c7163dd4f20685d4d51"} Mar 19 10:41:49 crc kubenswrapper[4765]: I0319 10:41:49.008368 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ct9xj-config-q8849" event={"ID":"f9f17842-7a45-43e9-bb84-03ffa0a65f86","Type":"ContainerStarted","Data":"0f659073b670b185d5563a01c8d665ee18c86f1b09f1af00a3b0bc89ac2c5079"} Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.033261 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"2533224e09bec8aeb7facbeab2c69fa9049c9b5fbe6b98531a4c5635dfbef24d"} Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.033926 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"0f1c411fdccf6f95db6460819cf88f1932fcf7d2b25517584f0d3705516321fc"} Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.033941 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"39f1f2f5029011501a8cd7700ef7a1f838ca80a7b127256aa145fd3c1cc98e7d"} Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.033979 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"fd95511c05e443814319d5b1d176e6e50edd5ee6cf334a5eb0ff336a9b70499b"} Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.033990 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" 
event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"b33fdc3b2ef1020d9a604f99d73b145d5b24c14ab7009308f4f91ca6b032a313"} Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.593136 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ct9xj-config-q8849" Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.766713 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9f17842-7a45-43e9-bb84-03ffa0a65f86-scripts\") pod \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.766770 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f9f17842-7a45-43e9-bb84-03ffa0a65f86-additional-scripts\") pod \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.766932 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-log-ovn\") pod \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.767021 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-run-ovn\") pod \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.767087 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-run\") pod 
\"f9f17842-7a45-43e9-bb84-03ffa0a65f86\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.767165 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xppdb\" (UniqueName: \"kubernetes.io/projected/f9f17842-7a45-43e9-bb84-03ffa0a65f86-kube-api-access-xppdb\") pod \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\" (UID: \"f9f17842-7a45-43e9-bb84-03ffa0a65f86\") " Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.768105 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f17842-7a45-43e9-bb84-03ffa0a65f86-additional-scripts" (OuterVolumeSpecName: "additional-scripts") pod "f9f17842-7a45-43e9-bb84-03ffa0a65f86" (UID: "f9f17842-7a45-43e9-bb84-03ffa0a65f86"). InnerVolumeSpecName "additional-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.769070 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9f17842-7a45-43e9-bb84-03ffa0a65f86-scripts" (OuterVolumeSpecName: "scripts") pod "f9f17842-7a45-43e9-bb84-03ffa0a65f86" (UID: "f9f17842-7a45-43e9-bb84-03ffa0a65f86"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.769776 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-run-ovn" (OuterVolumeSpecName: "var-run-ovn") pod "f9f17842-7a45-43e9-bb84-03ffa0a65f86" (UID: "f9f17842-7a45-43e9-bb84-03ffa0a65f86"). InnerVolumeSpecName "var-run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.769808 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-log-ovn" (OuterVolumeSpecName: "var-log-ovn") pod "f9f17842-7a45-43e9-bb84-03ffa0a65f86" (UID: "f9f17842-7a45-43e9-bb84-03ffa0a65f86"). InnerVolumeSpecName "var-log-ovn". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.769826 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-run" (OuterVolumeSpecName: "var-run") pod "f9f17842-7a45-43e9-bb84-03ffa0a65f86" (UID: "f9f17842-7a45-43e9-bb84-03ffa0a65f86"). InnerVolumeSpecName "var-run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.774334 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9f17842-7a45-43e9-bb84-03ffa0a65f86-kube-api-access-xppdb" (OuterVolumeSpecName: "kube-api-access-xppdb") pod "f9f17842-7a45-43e9-bb84-03ffa0a65f86" (UID: "f9f17842-7a45-43e9-bb84-03ffa0a65f86"). InnerVolumeSpecName "kube-api-access-xppdb". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.869420 4765 reconciler_common.go:293] "Volume detached for volume \"var-log-ovn\" (UniqueName: \"kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-log-ovn\") on node \"crc\" DevicePath \"\""
Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.869463 4765 reconciler_common.go:293] "Volume detached for volume \"var-run-ovn\" (UniqueName: \"kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-run-ovn\") on node \"crc\" DevicePath \"\""
Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.869476 4765 reconciler_common.go:293] "Volume detached for volume \"var-run\" (UniqueName: \"kubernetes.io/host-path/f9f17842-7a45-43e9-bb84-03ffa0a65f86-var-run\") on node \"crc\" DevicePath \"\""
Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.869487 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xppdb\" (UniqueName: \"kubernetes.io/projected/f9f17842-7a45-43e9-bb84-03ffa0a65f86-kube-api-access-xppdb\") on node \"crc\" DevicePath \"\""
Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.869499 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/f9f17842-7a45-43e9-bb84-03ffa0a65f86-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 10:41:50 crc kubenswrapper[4765]: I0319 10:41:50.869506 4765 reconciler_common.go:293] "Volume detached for volume \"additional-scripts\" (UniqueName: \"kubernetes.io/configmap/f9f17842-7a45-43e9-bb84-03ffa0a65f86-additional-scripts\") on node \"crc\" DevicePath \"\""
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.044771 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-controller-ct9xj-config-q8849" event={"ID":"f9f17842-7a45-43e9-bb84-03ffa0a65f86","Type":"ContainerDied","Data":"0f659073b670b185d5563a01c8d665ee18c86f1b09f1af00a3b0bc89ac2c5079"}
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.044827 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f659073b670b185d5563a01c8d665ee18c86f1b09f1af00a3b0bc89ac2c5079"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.044838 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-controller-ct9xj-config-q8849"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.052838 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"166efec6024514f380cc28c9fb5865caf77095711b2d8354f08348584e918d46"}
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.052883 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-storage-0" event={"ID":"21734dce-e034-473f-a919-7026f837ede2","Type":"ContainerStarted","Data":"b23c883e26340aee4145f4b01d7a0f60e19eebf7be8c907a3bf45165edaa6a44"}
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.104366 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-storage-0" podStartSLOduration=27.776809888 podStartE2EDuration="42.104344159s" podCreationTimestamp="2026-03-19 10:41:09 +0000 UTC" firstStartedPulling="2026-03-19 10:41:34.590312615 +0000 UTC m=+1192.939258157" lastFinishedPulling="2026-03-19 10:41:48.917846886 +0000 UTC m=+1207.266792428" observedRunningTime="2026-03-19 10:41:51.100624188 +0000 UTC m=+1209.449569750" watchObservedRunningTime="2026-03-19 10:41:51.104344159 +0000 UTC m=+1209.453289701"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.420458 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jfh7n"]
Mar 19 10:41:51 crc kubenswrapper[4765]: E0319 10:41:51.421065 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9f17842-7a45-43e9-bb84-03ffa0a65f86" containerName="ovn-config"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.421095 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9f17842-7a45-43e9-bb84-03ffa0a65f86" containerName="ovn-config"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.421435 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9f17842-7a45-43e9-bb84-03ffa0a65f86" containerName="ovn-config"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.426167 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.430098 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-swift-storage-0"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.433530 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jfh7n"]
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.583731 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.584109 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.584139 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.584158 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-config\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.584206 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ktgh\" (UniqueName: \"kubernetes.io/projected/b746149d-0502-49df-b998-44cc51484918-kube-api-access-9ktgh\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.584227 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.662713 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ovn-controller-ct9xj-config-q8849"]
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.670754 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ovn-controller-ct9xj-config-q8849"]
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.685926 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ktgh\" (UniqueName: \"kubernetes.io/projected/b746149d-0502-49df-b998-44cc51484918-kube-api-access-9ktgh\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.686003 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.686079 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.686155 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.686186 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.686213 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-config\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.687439 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-ovsdbserver-sb\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.687446 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-ovsdbserver-nb\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.687498 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-dns-swift-storage-0\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.687501 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-dns-svc\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.687753 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-config\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.715905 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ktgh\" (UniqueName: \"kubernetes.io/projected/b746149d-0502-49df-b998-44cc51484918-kube-api-access-9ktgh\") pod \"dnsmasq-dns-5c79d794d7-jfh7n\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:51 crc kubenswrapper[4765]: I0319 10:41:51.760045 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:52 crc kubenswrapper[4765]: I0319 10:41:52.209425 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jfh7n"]
Mar 19 10:41:52 crc kubenswrapper[4765]: I0319 10:41:52.379400 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f9f17842-7a45-43e9-bb84-03ffa0a65f86" path="/var/lib/kubelet/pods/f9f17842-7a45-43e9-bb84-03ffa0a65f86/volumes"
Mar 19 10:41:53 crc kubenswrapper[4765]: I0319 10:41:53.072934 4765 generic.go:334] "Generic (PLEG): container finished" podID="b746149d-0502-49df-b998-44cc51484918" containerID="bcbc3c9e55fca08746880a7115af131f29dd10fec188972d3d7ed7ec48b4b58e" exitCode=0
Mar 19 10:41:53 crc kubenswrapper[4765]: I0319 10:41:53.073267 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n" event={"ID":"b746149d-0502-49df-b998-44cc51484918","Type":"ContainerDied","Data":"bcbc3c9e55fca08746880a7115af131f29dd10fec188972d3d7ed7ec48b4b58e"}
Mar 19 10:41:53 crc kubenswrapper[4765]: I0319 10:41:53.073293 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n" event={"ID":"b746149d-0502-49df-b998-44cc51484918","Type":"ContainerStarted","Data":"abbe4a718e180061399a7f8002d36a1eded37f31e93fee7d0c610958fdc60c1c"}
Mar 19 10:41:53 crc kubenswrapper[4765]: I0319 10:41:53.075999 4765 generic.go:334] "Generic (PLEG): container finished" podID="06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0" containerID="ee29eba42d553c93953418fe06ecbf9d6ed4b1cd71a398e57f4359ab6ab93961" exitCode=0
Mar 19 10:41:53 crc kubenswrapper[4765]: I0319 10:41:53.076045 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ds8t6" event={"ID":"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0","Type":"ContainerDied","Data":"ee29eba42d553c93953418fe06ecbf9d6ed4b1cd71a398e57f4359ab6ab93961"}
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.088331 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n" event={"ID":"b746149d-0502-49df-b998-44cc51484918","Type":"ContainerStarted","Data":"8d5b4ab3f1f01df877b53081f86edcbe504fa61d8c3212f752ef14f9fd02c088"}
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.119209 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n" podStartSLOduration=3.119189838 podStartE2EDuration="3.119189838s" podCreationTimestamp="2026-03-19 10:41:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:41:54.114209763 +0000 UTC m=+1212.463155305" watchObservedRunningTime="2026-03-19 10:41:54.119189838 +0000 UTC m=+1212.468135380"
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.343622 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0"
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.622143 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ds8t6"
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.684621 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0"
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.741755 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-config-data\") pod \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") "
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.741815 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-db-sync-config-data\") pod \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") "
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.741869 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vg4kv\" (UniqueName: \"kubernetes.io/projected/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-kube-api-access-vg4kv\") pod \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") "
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.741943 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-combined-ca-bundle\") pod \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\" (UID: \"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0\") "
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.755679 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-kube-api-access-vg4kv" (OuterVolumeSpecName: "kube-api-access-vg4kv") pod "06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0" (UID: "06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0"). InnerVolumeSpecName "kube-api-access-vg4kv". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.757094 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0" (UID: "06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.789808 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0" (UID: "06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.807093 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-config-data" (OuterVolumeSpecName: "config-data") pod "06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0" (UID: "06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.844024 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.844059 4765 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-db-sync-config-data\") on node \"crc\" DevicePath \"\""
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.844071 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vg4kv\" (UniqueName: \"kubernetes.io/projected/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-kube-api-access-vg4kv\") on node \"crc\" DevicePath \"\""
Mar 19 10:41:54 crc kubenswrapper[4765]: I0319 10:41:54.844079 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.097548 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-db-sync-ds8t6"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.097607 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-db-sync-ds8t6" event={"ID":"06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0","Type":"ContainerDied","Data":"075ce43fd7a2462768ad0c53fa07b989bd02bd483991f5dd0f01f9ac29344f4d"}
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.097781 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="075ce43fd7a2462768ad0c53fa07b989bd02bd483991f5dd0f01f9ac29344f4d"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.097806 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.692043 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jfh7n"]
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.738759 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-fwp2t"]
Mar 19 10:41:55 crc kubenswrapper[4765]: E0319 10:41:55.739137 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0" containerName="glance-db-sync"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.739155 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0" containerName="glance-db-sync"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.739319 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0" containerName="glance-db-sync"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.740142 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.758855 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-fwp2t"]
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.860247 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.860325 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.860358 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.860408 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-config\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.860618 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd79b\" (UniqueName: \"kubernetes.io/projected/c66aa28a-5eba-4c64-abec-75b6577131a4-kube-api-access-fd79b\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.860660 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.962349 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd79b\" (UniqueName: \"kubernetes.io/projected/c66aa28a-5eba-4c64-abec-75b6577131a4-kube-api-access-fd79b\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.962424 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.962519 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.962567 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.962601 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.962651 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-config\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.964193 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-config\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.964222 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-ovsdbserver-nb\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.964296 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-dns-swift-storage-0\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.964403 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-dns-svc\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.964534 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-ovsdbserver-sb\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:55 crc kubenswrapper[4765]: I0319 10:41:55.987123 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd79b\" (UniqueName: \"kubernetes.io/projected/c66aa28a-5eba-4c64-abec-75b6577131a4-kube-api-access-fd79b\") pod \"dnsmasq-dns-5f59b8f679-fwp2t\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:56 crc kubenswrapper[4765]: I0319 10:41:56.058602 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t"
Mar 19 10:41:56 crc kubenswrapper[4765]: I0319 10:41:56.580373 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-fwp2t"]
Mar 19 10:41:56 crc kubenswrapper[4765]: W0319 10:41:56.581288 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc66aa28a_5eba_4c64_abec_75b6577131a4.slice/crio-616e3b1f592660803ed84428701c5cb1fade2c75955d53b01d14e5b61416273e WatchSource:0}: Error finding container 616e3b1f592660803ed84428701c5cb1fade2c75955d53b01d14e5b61416273e: Status 404 returned error can't find the container with id 616e3b1f592660803ed84428701c5cb1fade2c75955d53b01d14e5b61416273e
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.028701 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-create-6fk75"]
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.030346 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6fk75"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.039360 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6fk75"]
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.081279 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jldbj\" (UniqueName: \"kubernetes.io/projected/a4914033-41f9-467d-b55e-c1d89a8fab4b-kube-api-access-jldbj\") pod \"cinder-db-create-6fk75\" (UID: \"a4914033-41f9-467d-b55e-c1d89a8fab4b\") " pod="openstack/cinder-db-create-6fk75"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.081339 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4914033-41f9-467d-b55e-c1d89a8fab4b-operator-scripts\") pod \"cinder-db-create-6fk75\" (UID: \"a4914033-41f9-467d-b55e-c1d89a8fab4b\") " pod="openstack/cinder-db-create-6fk75"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.134068 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-bedf-account-create-update-s9kdd"]
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.135527 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bedf-account-create-update-s9kdd"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.141624 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-db-secret"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.146912 4765 generic.go:334] "Generic (PLEG): container finished" podID="c66aa28a-5eba-4c64-abec-75b6577131a4" containerID="5f76ded8fa06729ab4d0d7bd559c713625500321d87cdd2fbaa1e57208c35577" exitCode=0
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.147297 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n" podUID="b746149d-0502-49df-b998-44cc51484918" containerName="dnsmasq-dns" containerID="cri-o://8d5b4ab3f1f01df877b53081f86edcbe504fa61d8c3212f752ef14f9fd02c088" gracePeriod=10
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.151049 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" event={"ID":"c66aa28a-5eba-4c64-abec-75b6577131a4","Type":"ContainerDied","Data":"5f76ded8fa06729ab4d0d7bd559c713625500321d87cdd2fbaa1e57208c35577"}
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.151110 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" event={"ID":"c66aa28a-5eba-4c64-abec-75b6577131a4","Type":"ContainerStarted","Data":"616e3b1f592660803ed84428701c5cb1fade2c75955d53b01d14e5b61416273e"}
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.153800 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bedf-account-create-update-s9kdd"]
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.182313 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jldbj\" (UniqueName: \"kubernetes.io/projected/a4914033-41f9-467d-b55e-c1d89a8fab4b-kube-api-access-jldbj\") pod \"cinder-db-create-6fk75\" (UID: \"a4914033-41f9-467d-b55e-c1d89a8fab4b\") " pod="openstack/cinder-db-create-6fk75"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.182374 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4914033-41f9-467d-b55e-c1d89a8fab4b-operator-scripts\") pod \"cinder-db-create-6fk75\" (UID: \"a4914033-41f9-467d-b55e-c1d89a8fab4b\") " pod="openstack/cinder-db-create-6fk75"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.182428 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t4ctp\" (UniqueName: \"kubernetes.io/projected/549655a2-d327-424f-ae72-6491fa466bdd-kube-api-access-t4ctp\") pod \"cinder-bedf-account-create-update-s9kdd\" (UID: \"549655a2-d327-424f-ae72-6491fa466bdd\") " pod="openstack/cinder-bedf-account-create-update-s9kdd"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.182518 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/549655a2-d327-424f-ae72-6491fa466bdd-operator-scripts\") pod \"cinder-bedf-account-create-update-s9kdd\" (UID: \"549655a2-d327-424f-ae72-6491fa466bdd\") " pod="openstack/cinder-bedf-account-create-update-s9kdd"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.184420 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4914033-41f9-467d-b55e-c1d89a8fab4b-operator-scripts\") pod \"cinder-db-create-6fk75\" (UID: \"a4914033-41f9-467d-b55e-c1d89a8fab4b\") " pod="openstack/cinder-db-create-6fk75"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.223044 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jldbj\" (UniqueName: \"kubernetes.io/projected/a4914033-41f9-467d-b55e-c1d89a8fab4b-kube-api-access-jldbj\") pod \"cinder-db-create-6fk75\" (UID: \"a4914033-41f9-467d-b55e-c1d89a8fab4b\") " pod="openstack/cinder-db-create-6fk75"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.303661 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/549655a2-d327-424f-ae72-6491fa466bdd-operator-scripts\") pod \"cinder-bedf-account-create-update-s9kdd\" (UID: \"549655a2-d327-424f-ae72-6491fa466bdd\") " pod="openstack/cinder-bedf-account-create-update-s9kdd"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.304007 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t4ctp\" (UniqueName: \"kubernetes.io/projected/549655a2-d327-424f-ae72-6491fa466bdd-kube-api-access-t4ctp\") pod \"cinder-bedf-account-create-update-s9kdd\" (UID: \"549655a2-d327-424f-ae72-6491fa466bdd\") " pod="openstack/cinder-bedf-account-create-update-s9kdd"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.304689 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/549655a2-d327-424f-ae72-6491fa466bdd-operator-scripts\") pod \"cinder-bedf-account-create-update-s9kdd\" (UID: \"549655a2-d327-424f-ae72-6491fa466bdd\") " pod="openstack/cinder-bedf-account-create-update-s9kdd"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.339247 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-create-6jl85"]
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.339698 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t4ctp\" (UniqueName: \"kubernetes.io/projected/549655a2-d327-424f-ae72-6491fa466bdd-kube-api-access-t4ctp\") pod \"cinder-bedf-account-create-update-s9kdd\" (UID: \"549655a2-d327-424f-ae72-6491fa466bdd\") " pod="openstack/cinder-bedf-account-create-update-s9kdd"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.340847 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6jl85"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.350603 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-create-6fk75"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.352862 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6jl85"]
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.463782 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/453b3d54-509f-4f11-a718-0bd8e271953e-operator-scripts\") pod \"barbican-db-create-6jl85\" (UID: \"453b3d54-509f-4f11-a718-0bd8e271953e\") " pod="openstack/barbican-db-create-6jl85"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.463978 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzvhn\" (UniqueName: \"kubernetes.io/projected/453b3d54-509f-4f11-a718-0bd8e271953e-kube-api-access-lzvhn\") pod \"barbican-db-create-6jl85\" (UID: \"453b3d54-509f-4f11-a718-0bd8e271953e\") " pod="openstack/barbican-db-create-6jl85"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.500192 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-507b-account-create-update-jxpl9"]
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.507063 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bedf-account-create-update-s9kdd"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.525498 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-507b-account-create-update-jxpl9"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.532201 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-507b-account-create-update-jxpl9"]
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.551293 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-db-secret"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.561120 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-create-h2h92"]
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.567356 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/453b3d54-509f-4f11-a718-0bd8e271953e-operator-scripts\") pod \"barbican-db-create-6jl85\" (UID: \"453b3d54-509f-4f11-a718-0bd8e271953e\") " pod="openstack/barbican-db-create-6jl85"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.567510 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lzvhn\" (UniqueName: \"kubernetes.io/projected/453b3d54-509f-4f11-a718-0bd8e271953e-kube-api-access-lzvhn\") pod \"barbican-db-create-6jl85\" (UID: \"453b3d54-509f-4f11-a718-0bd8e271953e\") " pod="openstack/barbican-db-create-6jl85"
Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.568621 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-h2h92" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.569584 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/453b3d54-509f-4f11-a718-0bd8e271953e-operator-scripts\") pod \"barbican-db-create-6jl85\" (UID: \"453b3d54-509f-4f11-a718-0bd8e271953e\") " pod="openstack/barbican-db-create-6jl85" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.648972 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzvhn\" (UniqueName: \"kubernetes.io/projected/453b3d54-509f-4f11-a718-0bd8e271953e-kube-api-access-lzvhn\") pod \"barbican-db-create-6jl85\" (UID: \"453b3d54-509f-4f11-a718-0bd8e271953e\") " pod="openstack/barbican-db-create-6jl85" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.652310 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-h2h92"] Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.669475 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mhj7\" (UniqueName: \"kubernetes.io/projected/95479eb2-1db7-4def-a4da-5ce9dbf85e13-kube-api-access-4mhj7\") pod \"barbican-507b-account-create-update-jxpl9\" (UID: \"95479eb2-1db7-4def-a4da-5ce9dbf85e13\") " pod="openstack/barbican-507b-account-create-update-jxpl9" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.669564 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qfggb\" (UniqueName: \"kubernetes.io/projected/36534f7f-ff2e-45f0-8fc1-827ae9ebd32b-kube-api-access-qfggb\") pod \"neutron-db-create-h2h92\" (UID: \"36534f7f-ff2e-45f0-8fc1-827ae9ebd32b\") " pod="openstack/neutron-db-create-h2h92" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.669691 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36534f7f-ff2e-45f0-8fc1-827ae9ebd32b-operator-scripts\") pod \"neutron-db-create-h2h92\" (UID: \"36534f7f-ff2e-45f0-8fc1-827ae9ebd32b\") " pod="openstack/neutron-db-create-h2h92" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.669723 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95479eb2-1db7-4def-a4da-5ce9dbf85e13-operator-scripts\") pod \"barbican-507b-account-create-update-jxpl9\" (UID: \"95479eb2-1db7-4def-a4da-5ce9dbf85e13\") " pod="openstack/barbican-507b-account-create-update-jxpl9" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.721059 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-31fd-account-create-update-9b8kl"] Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.722753 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-31fd-account-create-update-9b8kl" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.741132 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-31fd-account-create-update-9b8kl"] Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.750902 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-db-secret" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.776806 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-db-sync-597fd"] Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.781167 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36534f7f-ff2e-45f0-8fc1-827ae9ebd32b-operator-scripts\") pod \"neutron-db-create-h2h92\" (UID: \"36534f7f-ff2e-45f0-8fc1-827ae9ebd32b\") " pod="openstack/neutron-db-create-h2h92" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.781224 
4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95479eb2-1db7-4def-a4da-5ce9dbf85e13-operator-scripts\") pod \"barbican-507b-account-create-update-jxpl9\" (UID: \"95479eb2-1db7-4def-a4da-5ce9dbf85e13\") " pod="openstack/barbican-507b-account-create-update-jxpl9" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.781252 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8nknf\" (UniqueName: \"kubernetes.io/projected/7f66d34e-14df-45dc-b7a6-8a83c4f6b19f-kube-api-access-8nknf\") pod \"neutron-31fd-account-create-update-9b8kl\" (UID: \"7f66d34e-14df-45dc-b7a6-8a83c4f6b19f\") " pod="openstack/neutron-31fd-account-create-update-9b8kl" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.781279 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4mhj7\" (UniqueName: \"kubernetes.io/projected/95479eb2-1db7-4def-a4da-5ce9dbf85e13-kube-api-access-4mhj7\") pod \"barbican-507b-account-create-update-jxpl9\" (UID: \"95479eb2-1db7-4def-a4da-5ce9dbf85e13\") " pod="openstack/barbican-507b-account-create-update-jxpl9" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.781303 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f66d34e-14df-45dc-b7a6-8a83c4f6b19f-operator-scripts\") pod \"neutron-31fd-account-create-update-9b8kl\" (UID: \"7f66d34e-14df-45dc-b7a6-8a83c4f6b19f\") " pod="openstack/neutron-31fd-account-create-update-9b8kl" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.781332 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qfggb\" (UniqueName: \"kubernetes.io/projected/36534f7f-ff2e-45f0-8fc1-827ae9ebd32b-kube-api-access-qfggb\") pod \"neutron-db-create-h2h92\" (UID: 
\"36534f7f-ff2e-45f0-8fc1-827ae9ebd32b\") " pod="openstack/neutron-db-create-h2h92" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.782398 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36534f7f-ff2e-45f0-8fc1-827ae9ebd32b-operator-scripts\") pod \"neutron-db-create-h2h92\" (UID: \"36534f7f-ff2e-45f0-8fc1-827ae9ebd32b\") " pod="openstack/neutron-db-create-h2h92" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.783154 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95479eb2-1db7-4def-a4da-5ce9dbf85e13-operator-scripts\") pod \"barbican-507b-account-create-update-jxpl9\" (UID: \"95479eb2-1db7-4def-a4da-5ce9dbf85e13\") " pod="openstack/barbican-507b-account-create-update-jxpl9" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.786878 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-597fd" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.790722 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-597fd"] Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.802675 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.803053 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zvnkb" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.803310 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.809674 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.821402 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-qfggb\" (UniqueName: \"kubernetes.io/projected/36534f7f-ff2e-45f0-8fc1-827ae9ebd32b-kube-api-access-qfggb\") pod \"neutron-db-create-h2h92\" (UID: \"36534f7f-ff2e-45f0-8fc1-827ae9ebd32b\") " pod="openstack/neutron-db-create-h2h92" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.836579 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mhj7\" (UniqueName: \"kubernetes.io/projected/95479eb2-1db7-4def-a4da-5ce9dbf85e13-kube-api-access-4mhj7\") pod \"barbican-507b-account-create-update-jxpl9\" (UID: \"95479eb2-1db7-4def-a4da-5ce9dbf85e13\") " pod="openstack/barbican-507b-account-create-update-jxpl9" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.867970 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6jl85" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.883003 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qk4jj\" (UniqueName: \"kubernetes.io/projected/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-kube-api-access-qk4jj\") pod \"keystone-db-sync-597fd\" (UID: \"4216cf70-5a7a-4d05-8c3d-20c4af295ac4\") " pod="openstack/keystone-db-sync-597fd" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.883079 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8nknf\" (UniqueName: \"kubernetes.io/projected/7f66d34e-14df-45dc-b7a6-8a83c4f6b19f-kube-api-access-8nknf\") pod \"neutron-31fd-account-create-update-9b8kl\" (UID: \"7f66d34e-14df-45dc-b7a6-8a83c4f6b19f\") " pod="openstack/neutron-31fd-account-create-update-9b8kl" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.883116 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f66d34e-14df-45dc-b7a6-8a83c4f6b19f-operator-scripts\") pod \"neutron-31fd-account-create-update-9b8kl\" 
(UID: \"7f66d34e-14df-45dc-b7a6-8a83c4f6b19f\") " pod="openstack/neutron-31fd-account-create-update-9b8kl" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.883195 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-config-data\") pod \"keystone-db-sync-597fd\" (UID: \"4216cf70-5a7a-4d05-8c3d-20c4af295ac4\") " pod="openstack/keystone-db-sync-597fd" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.883221 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-combined-ca-bundle\") pod \"keystone-db-sync-597fd\" (UID: \"4216cf70-5a7a-4d05-8c3d-20c4af295ac4\") " pod="openstack/keystone-db-sync-597fd" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.884288 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f66d34e-14df-45dc-b7a6-8a83c4f6b19f-operator-scripts\") pod \"neutron-31fd-account-create-update-9b8kl\" (UID: \"7f66d34e-14df-45dc-b7a6-8a83c4f6b19f\") " pod="openstack/neutron-31fd-account-create-update-9b8kl" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.921026 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-507b-account-create-update-jxpl9" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.922581 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8nknf\" (UniqueName: \"kubernetes.io/projected/7f66d34e-14df-45dc-b7a6-8a83c4f6b19f-kube-api-access-8nknf\") pod \"neutron-31fd-account-create-update-9b8kl\" (UID: \"7f66d34e-14df-45dc-b7a6-8a83c4f6b19f\") " pod="openstack/neutron-31fd-account-create-update-9b8kl" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.969652 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-h2h92" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.985161 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-config-data\") pod \"keystone-db-sync-597fd\" (UID: \"4216cf70-5a7a-4d05-8c3d-20c4af295ac4\") " pod="openstack/keystone-db-sync-597fd" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.985212 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-combined-ca-bundle\") pod \"keystone-db-sync-597fd\" (UID: \"4216cf70-5a7a-4d05-8c3d-20c4af295ac4\") " pod="openstack/keystone-db-sync-597fd" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.985287 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qk4jj\" (UniqueName: \"kubernetes.io/projected/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-kube-api-access-qk4jj\") pod \"keystone-db-sync-597fd\" (UID: \"4216cf70-5a7a-4d05-8c3d-20c4af295ac4\") " pod="openstack/keystone-db-sync-597fd" Mar 19 10:41:57 crc kubenswrapper[4765]: I0319 10:41:57.999893 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-config-data\") pod \"keystone-db-sync-597fd\" (UID: \"4216cf70-5a7a-4d05-8c3d-20c4af295ac4\") " pod="openstack/keystone-db-sync-597fd" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.007921 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-combined-ca-bundle\") pod \"keystone-db-sync-597fd\" (UID: \"4216cf70-5a7a-4d05-8c3d-20c4af295ac4\") " pod="openstack/keystone-db-sync-597fd" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.034044 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qk4jj\" (UniqueName: \"kubernetes.io/projected/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-kube-api-access-qk4jj\") pod \"keystone-db-sync-597fd\" (UID: \"4216cf70-5a7a-4d05-8c3d-20c4af295ac4\") " pod="openstack/keystone-db-sync-597fd" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.134465 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-31fd-account-create-update-9b8kl" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.199818 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-597fd" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.211814 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" event={"ID":"c66aa28a-5eba-4c64-abec-75b6577131a4","Type":"ContainerStarted","Data":"371b1829bc0d6f74ef0efb4b0612ae0419ac2477162a39935d459530c1b4c6ba"} Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.212151 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.215413 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.226299 4765 generic.go:334] "Generic (PLEG): container finished" podID="b746149d-0502-49df-b998-44cc51484918" containerID="8d5b4ab3f1f01df877b53081f86edcbe504fa61d8c3212f752ef14f9fd02c088" exitCode=0 Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.226374 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n" event={"ID":"b746149d-0502-49df-b998-44cc51484918","Type":"ContainerDied","Data":"8d5b4ab3f1f01df877b53081f86edcbe504fa61d8c3212f752ef14f9fd02c088"} Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.226430 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n" event={"ID":"b746149d-0502-49df-b998-44cc51484918","Type":"ContainerDied","Data":"abbe4a718e180061399a7f8002d36a1eded37f31e93fee7d0c610958fdc60c1c"} Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.226452 4765 scope.go:117] "RemoveContainer" containerID="8d5b4ab3f1f01df877b53081f86edcbe504fa61d8c3212f752ef14f9fd02c088" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.291029 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-dns-svc\") pod \"b746149d-0502-49df-b998-44cc51484918\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.291086 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-dns-swift-storage-0\") pod \"b746149d-0502-49df-b998-44cc51484918\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.291158 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-config\") pod \"b746149d-0502-49df-b998-44cc51484918\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.291206 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-ovsdbserver-nb\") pod \"b746149d-0502-49df-b998-44cc51484918\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.291301 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9ktgh\" (UniqueName: \"kubernetes.io/projected/b746149d-0502-49df-b998-44cc51484918-kube-api-access-9ktgh\") pod \"b746149d-0502-49df-b998-44cc51484918\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.291377 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-ovsdbserver-sb\") pod \"b746149d-0502-49df-b998-44cc51484918\" (UID: \"b746149d-0502-49df-b998-44cc51484918\") " Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.304056 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b746149d-0502-49df-b998-44cc51484918-kube-api-access-9ktgh" (OuterVolumeSpecName: "kube-api-access-9ktgh") pod "b746149d-0502-49df-b998-44cc51484918" (UID: "b746149d-0502-49df-b998-44cc51484918"). InnerVolumeSpecName "kube-api-access-9ktgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.320227 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" podStartSLOduration=3.320185477 podStartE2EDuration="3.320185477s" podCreationTimestamp="2026-03-19 10:41:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:41:58.287040848 +0000 UTC m=+1216.635986400" watchObservedRunningTime="2026-03-19 10:41:58.320185477 +0000 UTC m=+1216.669131019" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.341106 4765 scope.go:117] "RemoveContainer" containerID="bcbc3c9e55fca08746880a7115af131f29dd10fec188972d3d7ed7ec48b4b58e" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.357536 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "b746149d-0502-49df-b998-44cc51484918" (UID: "b746149d-0502-49df-b998-44cc51484918"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.395293 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.395779 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9ktgh\" (UniqueName: \"kubernetes.io/projected/b746149d-0502-49df-b998-44cc51484918-kube-api-access-9ktgh\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.413532 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-config" (OuterVolumeSpecName: "config") pod "b746149d-0502-49df-b998-44cc51484918" (UID: "b746149d-0502-49df-b998-44cc51484918"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.422671 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "b746149d-0502-49df-b998-44cc51484918" (UID: "b746149d-0502-49df-b998-44cc51484918"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.423259 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "b746149d-0502-49df-b998-44cc51484918" (UID: "b746149d-0502-49df-b998-44cc51484918"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.443336 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-create-6fk75"] Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.463184 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "b746149d-0502-49df-b998-44cc51484918" (UID: "b746149d-0502-49df-b998-44cc51484918"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.477302 4765 scope.go:117] "RemoveContainer" containerID="8d5b4ab3f1f01df877b53081f86edcbe504fa61d8c3212f752ef14f9fd02c088" Mar 19 10:41:58 crc kubenswrapper[4765]: E0319 10:41:58.480676 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d5b4ab3f1f01df877b53081f86edcbe504fa61d8c3212f752ef14f9fd02c088\": container with ID starting with 8d5b4ab3f1f01df877b53081f86edcbe504fa61d8c3212f752ef14f9fd02c088 not found: ID does not exist" containerID="8d5b4ab3f1f01df877b53081f86edcbe504fa61d8c3212f752ef14f9fd02c088" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.480719 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d5b4ab3f1f01df877b53081f86edcbe504fa61d8c3212f752ef14f9fd02c088"} err="failed to get container status \"8d5b4ab3f1f01df877b53081f86edcbe504fa61d8c3212f752ef14f9fd02c088\": rpc error: code = NotFound desc = could not find container \"8d5b4ab3f1f01df877b53081f86edcbe504fa61d8c3212f752ef14f9fd02c088\": container with ID starting with 8d5b4ab3f1f01df877b53081f86edcbe504fa61d8c3212f752ef14f9fd02c088 not found: ID does not exist" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.480753 4765 scope.go:117] "RemoveContainer" 
containerID="bcbc3c9e55fca08746880a7115af131f29dd10fec188972d3d7ed7ec48b4b58e" Mar 19 10:41:58 crc kubenswrapper[4765]: E0319 10:41:58.481106 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bcbc3c9e55fca08746880a7115af131f29dd10fec188972d3d7ed7ec48b4b58e\": container with ID starting with bcbc3c9e55fca08746880a7115af131f29dd10fec188972d3d7ed7ec48b4b58e not found: ID does not exist" containerID="bcbc3c9e55fca08746880a7115af131f29dd10fec188972d3d7ed7ec48b4b58e" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.481131 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bcbc3c9e55fca08746880a7115af131f29dd10fec188972d3d7ed7ec48b4b58e"} err="failed to get container status \"bcbc3c9e55fca08746880a7115af131f29dd10fec188972d3d7ed7ec48b4b58e\": rpc error: code = NotFound desc = could not find container \"bcbc3c9e55fca08746880a7115af131f29dd10fec188972d3d7ed7ec48b4b58e\": container with ID starting with bcbc3c9e55fca08746880a7115af131f29dd10fec188972d3d7ed7ec48b4b58e not found: ID does not exist" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.500188 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.500235 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.500248 4765 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.500261 
4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b746149d-0502-49df-b998-44cc51484918-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.530362 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-bedf-account-create-update-s9kdd"] Mar 19 10:41:58 crc kubenswrapper[4765]: W0319 10:41:58.539343 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod549655a2_d327_424f_ae72_6491fa466bdd.slice/crio-e43c0b9a22c900395e2b9cbf1afbb3db120228bd3cde97f270323bfcccc31ab6 WatchSource:0}: Error finding container e43c0b9a22c900395e2b9cbf1afbb3db120228bd3cde97f270323bfcccc31ab6: Status 404 returned error can't find the container with id e43c0b9a22c900395e2b9cbf1afbb3db120228bd3cde97f270323bfcccc31ab6 Mar 19 10:41:58 crc kubenswrapper[4765]: I0319 10:41:58.680042 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-507b-account-create-update-jxpl9"] Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.003034 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-create-6jl85"] Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.108765 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-create-h2h92"] Mar 19 10:41:59 crc kubenswrapper[4765]: W0319 10:41:59.163872 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod36534f7f_ff2e_45f0_8fc1_827ae9ebd32b.slice/crio-9c684371ab4119977e41ac6a959c8ed9a3dbb0b30a355028cfba95b3465480e7 WatchSource:0}: Error finding container 9c684371ab4119977e41ac6a959c8ed9a3dbb0b30a355028cfba95b3465480e7: Status 404 returned error can't find the container with id 9c684371ab4119977e41ac6a959c8ed9a3dbb0b30a355028cfba95b3465480e7 Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 
10:41:59.271629 4765 generic.go:334] "Generic (PLEG): container finished" podID="a4914033-41f9-467d-b55e-c1d89a8fab4b" containerID="fe6d09544f88b41b3e7c804e567d48346b0bf23b5eaceb4e8474f71a9b463a43" exitCode=0 Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.272213 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6fk75" event={"ID":"a4914033-41f9-467d-b55e-c1d89a8fab4b","Type":"ContainerDied","Data":"fe6d09544f88b41b3e7c804e567d48346b0bf23b5eaceb4e8474f71a9b463a43"} Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.272256 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6fk75" event={"ID":"a4914033-41f9-467d-b55e-c1d89a8fab4b","Type":"ContainerStarted","Data":"f908844a9d75394e6134e0dac731f4e2be81667be90a6790235d1ab498ae875e"} Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.281580 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5c79d794d7-jfh7n" Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.285007 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-db-sync-597fd"] Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.290757 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-h2h92" event={"ID":"36534f7f-ff2e-45f0-8fc1-827ae9ebd32b","Type":"ContainerStarted","Data":"9c684371ab4119977e41ac6a959c8ed9a3dbb0b30a355028cfba95b3465480e7"} Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.300634 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-507b-account-create-update-jxpl9" event={"ID":"95479eb2-1db7-4def-a4da-5ce9dbf85e13","Type":"ContainerStarted","Data":"6db17d80b8214aed4015b5b0aaab9ec8ac9c4c385259ead38f242a8e12bd869a"} Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.301313 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-507b-account-create-update-jxpl9" 
event={"ID":"95479eb2-1db7-4def-a4da-5ce9dbf85e13","Type":"ContainerStarted","Data":"c3a0e9351a2f7f247d748518065231f17c124edce3ddec04f072c87d5b12312a"} Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.310744 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6jl85" event={"ID":"453b3d54-509f-4f11-a718-0bd8e271953e","Type":"ContainerStarted","Data":"7240e984b355badd42d196e20f37ee7b6f58e28b61dbdbfdf1db72bb5a666f61"} Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.320489 4765 generic.go:334] "Generic (PLEG): container finished" podID="549655a2-d327-424f-ae72-6491fa466bdd" containerID="f885113378798203ce2115b5b6acce36282db32d1ced42eb42999ef4341c8c4a" exitCode=0 Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.321357 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bedf-account-create-update-s9kdd" event={"ID":"549655a2-d327-424f-ae72-6491fa466bdd","Type":"ContainerDied","Data":"f885113378798203ce2115b5b6acce36282db32d1ced42eb42999ef4341c8c4a"} Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.321473 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bedf-account-create-update-s9kdd" event={"ID":"549655a2-d327-424f-ae72-6491fa466bdd","Type":"ContainerStarted","Data":"e43c0b9a22c900395e2b9cbf1afbb3db120228bd3cde97f270323bfcccc31ab6"} Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.368250 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-31fd-account-create-update-9b8kl"] Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.383766 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-507b-account-create-update-jxpl9" podStartSLOduration=2.383745763 podStartE2EDuration="2.383745763s" podCreationTimestamp="2026-03-19 10:41:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
10:41:59.354048697 +0000 UTC m=+1217.702994229" watchObservedRunningTime="2026-03-19 10:41:59.383745763 +0000 UTC m=+1217.732691305" Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.443966 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jfh7n"] Mar 19 10:41:59 crc kubenswrapper[4765]: I0319 10:41:59.455774 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5c79d794d7-jfh7n"] Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.141476 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565282-gnhzm"] Mar 19 10:42:00 crc kubenswrapper[4765]: E0319 10:42:00.142318 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b746149d-0502-49df-b998-44cc51484918" containerName="dnsmasq-dns" Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.142337 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b746149d-0502-49df-b998-44cc51484918" containerName="dnsmasq-dns" Mar 19 10:42:00 crc kubenswrapper[4765]: E0319 10:42:00.142365 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b746149d-0502-49df-b998-44cc51484918" containerName="init" Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.142373 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b746149d-0502-49df-b998-44cc51484918" containerName="init" Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.142565 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="b746149d-0502-49df-b998-44cc51484918" containerName="dnsmasq-dns" Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.143432 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565282-gnhzm" Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.145809 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.147015 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.147629 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.153503 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565282-gnhzm"] Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.237308 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tggnw\" (UniqueName: \"kubernetes.io/projected/4ea0d631-9296-4da6-96c3-a94b068052ed-kube-api-access-tggnw\") pod \"auto-csr-approver-29565282-gnhzm\" (UID: \"4ea0d631-9296-4da6-96c3-a94b068052ed\") " pod="openshift-infra/auto-csr-approver-29565282-gnhzm" Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.331591 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-597fd" event={"ID":"4216cf70-5a7a-4d05-8c3d-20c4af295ac4","Type":"ContainerStarted","Data":"44bb943bd826c9fa0523639654583523ee540544cf1c37dfed7d0081ce558c6a"} Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.334328 4765 generic.go:334] "Generic (PLEG): container finished" podID="36534f7f-ff2e-45f0-8fc1-827ae9ebd32b" containerID="d522436d60bdd5fc0de249b55c07a37b11ad229d3cddb94b681dc057322aab8a" exitCode=0 Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.334371 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-h2h92" 
event={"ID":"36534f7f-ff2e-45f0-8fc1-827ae9ebd32b","Type":"ContainerDied","Data":"d522436d60bdd5fc0de249b55c07a37b11ad229d3cddb94b681dc057322aab8a"} Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.336420 4765 generic.go:334] "Generic (PLEG): container finished" podID="95479eb2-1db7-4def-a4da-5ce9dbf85e13" containerID="6db17d80b8214aed4015b5b0aaab9ec8ac9c4c385259ead38f242a8e12bd869a" exitCode=0 Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.336536 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-507b-account-create-update-jxpl9" event={"ID":"95479eb2-1db7-4def-a4da-5ce9dbf85e13","Type":"ContainerDied","Data":"6db17d80b8214aed4015b5b0aaab9ec8ac9c4c385259ead38f242a8e12bd869a"} Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.339052 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tggnw\" (UniqueName: \"kubernetes.io/projected/4ea0d631-9296-4da6-96c3-a94b068052ed-kube-api-access-tggnw\") pod \"auto-csr-approver-29565282-gnhzm\" (UID: \"4ea0d631-9296-4da6-96c3-a94b068052ed\") " pod="openshift-infra/auto-csr-approver-29565282-gnhzm" Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.339617 4765 generic.go:334] "Generic (PLEG): container finished" podID="453b3d54-509f-4f11-a718-0bd8e271953e" containerID="b961b6ba6a0a5ead3cf0f18642f7b2af581727fa5ab5824f71ffa9873c8280fa" exitCode=0 Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.339771 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6jl85" event={"ID":"453b3d54-509f-4f11-a718-0bd8e271953e","Type":"ContainerDied","Data":"b961b6ba6a0a5ead3cf0f18642f7b2af581727fa5ab5824f71ffa9873c8280fa"} Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.341522 4765 generic.go:334] "Generic (PLEG): container finished" podID="7f66d34e-14df-45dc-b7a6-8a83c4f6b19f" containerID="a1ee19cb4fc979afebeb07647143bf1626b8b2049c582ded46bc58c26cd82cca" exitCode=0 Mar 19 10:42:00 crc 
kubenswrapper[4765]: I0319 10:42:00.341576 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-31fd-account-create-update-9b8kl" event={"ID":"7f66d34e-14df-45dc-b7a6-8a83c4f6b19f","Type":"ContainerDied","Data":"a1ee19cb4fc979afebeb07647143bf1626b8b2049c582ded46bc58c26cd82cca"} Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.341638 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-31fd-account-create-update-9b8kl" event={"ID":"7f66d34e-14df-45dc-b7a6-8a83c4f6b19f","Type":"ContainerStarted","Data":"8b786fadf47a0e9b276cfa82b1df096e6b2395416366fbd3959da0e825c37dc1"} Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.364686 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tggnw\" (UniqueName: \"kubernetes.io/projected/4ea0d631-9296-4da6-96c3-a94b068052ed-kube-api-access-tggnw\") pod \"auto-csr-approver-29565282-gnhzm\" (UID: \"4ea0d631-9296-4da6-96c3-a94b068052ed\") " pod="openshift-infra/auto-csr-approver-29565282-gnhzm" Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.372149 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b746149d-0502-49df-b998-44cc51484918" path="/var/lib/kubelet/pods/b746149d-0502-49df-b998-44cc51484918/volumes" Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.466984 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565282-gnhzm" Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.821732 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6fk75" Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.957138 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4914033-41f9-467d-b55e-c1d89a8fab4b-operator-scripts\") pod \"a4914033-41f9-467d-b55e-c1d89a8fab4b\" (UID: \"a4914033-41f9-467d-b55e-c1d89a8fab4b\") " Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.957296 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jldbj\" (UniqueName: \"kubernetes.io/projected/a4914033-41f9-467d-b55e-c1d89a8fab4b-kube-api-access-jldbj\") pod \"a4914033-41f9-467d-b55e-c1d89a8fab4b\" (UID: \"a4914033-41f9-467d-b55e-c1d89a8fab4b\") " Mar 19 10:42:00 crc kubenswrapper[4765]: I0319 10:42:00.958603 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a4914033-41f9-467d-b55e-c1d89a8fab4b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "a4914033-41f9-467d-b55e-c1d89a8fab4b" (UID: "a4914033-41f9-467d-b55e-c1d89a8fab4b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.005594 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4914033-41f9-467d-b55e-c1d89a8fab4b-kube-api-access-jldbj" (OuterVolumeSpecName: "kube-api-access-jldbj") pod "a4914033-41f9-467d-b55e-c1d89a8fab4b" (UID: "a4914033-41f9-467d-b55e-c1d89a8fab4b"). InnerVolumeSpecName "kube-api-access-jldbj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.066366 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/a4914033-41f9-467d-b55e-c1d89a8fab4b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.066409 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jldbj\" (UniqueName: \"kubernetes.io/projected/a4914033-41f9-467d-b55e-c1d89a8fab4b-kube-api-access-jldbj\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.153759 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bedf-account-create-update-s9kdd" Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.272882 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t4ctp\" (UniqueName: \"kubernetes.io/projected/549655a2-d327-424f-ae72-6491fa466bdd-kube-api-access-t4ctp\") pod \"549655a2-d327-424f-ae72-6491fa466bdd\" (UID: \"549655a2-d327-424f-ae72-6491fa466bdd\") " Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.272964 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/549655a2-d327-424f-ae72-6491fa466bdd-operator-scripts\") pod \"549655a2-d327-424f-ae72-6491fa466bdd\" (UID: \"549655a2-d327-424f-ae72-6491fa466bdd\") " Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.273805 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/549655a2-d327-424f-ae72-6491fa466bdd-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "549655a2-d327-424f-ae72-6491fa466bdd" (UID: "549655a2-d327-424f-ae72-6491fa466bdd"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.296420 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/549655a2-d327-424f-ae72-6491fa466bdd-kube-api-access-t4ctp" (OuterVolumeSpecName: "kube-api-access-t4ctp") pod "549655a2-d327-424f-ae72-6491fa466bdd" (UID: "549655a2-d327-424f-ae72-6491fa466bdd"). InnerVolumeSpecName "kube-api-access-t4ctp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.359648 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-bedf-account-create-update-s9kdd" Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.359656 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-bedf-account-create-update-s9kdd" event={"ID":"549655a2-d327-424f-ae72-6491fa466bdd","Type":"ContainerDied","Data":"e43c0b9a22c900395e2b9cbf1afbb3db120228bd3cde97f270323bfcccc31ab6"} Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.360452 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e43c0b9a22c900395e2b9cbf1afbb3db120228bd3cde97f270323bfcccc31ab6" Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.368623 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-create-6fk75" Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.370067 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-create-6fk75" event={"ID":"a4914033-41f9-467d-b55e-c1d89a8fab4b","Type":"ContainerDied","Data":"f908844a9d75394e6134e0dac731f4e2be81667be90a6790235d1ab498ae875e"} Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.370123 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f908844a9d75394e6134e0dac731f4e2be81667be90a6790235d1ab498ae875e" Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.375579 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t4ctp\" (UniqueName: \"kubernetes.io/projected/549655a2-d327-424f-ae72-6491fa466bdd-kube-api-access-t4ctp\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.375615 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/549655a2-d327-424f-ae72-6491fa466bdd-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.391355 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565282-gnhzm"] Mar 19 10:42:01 crc kubenswrapper[4765]: W0319 10:42:01.401336 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4ea0d631_9296_4da6_96c3_a94b068052ed.slice/crio-94bdb7273953ef8283814f1d3f3b7f7193259ee75c3ec4dd561977918f579a69 WatchSource:0}: Error finding container 94bdb7273953ef8283814f1d3f3b7f7193259ee75c3ec4dd561977918f579a69: Status 404 returned error can't find the container with id 94bdb7273953ef8283814f1d3f3b7f7193259ee75c3ec4dd561977918f579a69 Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.658310 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:42:01 crc kubenswrapper[4765]: I0319 10:42:01.658773 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.063439 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-507b-account-create-update-jxpl9" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.212580 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mhj7\" (UniqueName: \"kubernetes.io/projected/95479eb2-1db7-4def-a4da-5ce9dbf85e13-kube-api-access-4mhj7\") pod \"95479eb2-1db7-4def-a4da-5ce9dbf85e13\" (UID: \"95479eb2-1db7-4def-a4da-5ce9dbf85e13\") " Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.212658 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95479eb2-1db7-4def-a4da-5ce9dbf85e13-operator-scripts\") pod \"95479eb2-1db7-4def-a4da-5ce9dbf85e13\" (UID: \"95479eb2-1db7-4def-a4da-5ce9dbf85e13\") " Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.213997 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95479eb2-1db7-4def-a4da-5ce9dbf85e13-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95479eb2-1db7-4def-a4da-5ce9dbf85e13" (UID: "95479eb2-1db7-4def-a4da-5ce9dbf85e13"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.227297 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95479eb2-1db7-4def-a4da-5ce9dbf85e13-kube-api-access-4mhj7" (OuterVolumeSpecName: "kube-api-access-4mhj7") pod "95479eb2-1db7-4def-a4da-5ce9dbf85e13" (UID: "95479eb2-1db7-4def-a4da-5ce9dbf85e13"). InnerVolumeSpecName "kube-api-access-4mhj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.318917 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4mhj7\" (UniqueName: \"kubernetes.io/projected/95479eb2-1db7-4def-a4da-5ce9dbf85e13-kube-api-access-4mhj7\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.318968 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95479eb2-1db7-4def-a4da-5ce9dbf85e13-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.343794 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-create-6jl85" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.376785 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-create-h2h92" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.376966 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-31fd-account-create-update-9b8kl" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.387704 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-507b-account-create-update-jxpl9" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.392695 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-507b-account-create-update-jxpl9" event={"ID":"95479eb2-1db7-4def-a4da-5ce9dbf85e13","Type":"ContainerDied","Data":"c3a0e9351a2f7f247d748518065231f17c124edce3ddec04f072c87d5b12312a"} Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.392739 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c3a0e9351a2f7f247d748518065231f17c124edce3ddec04f072c87d5b12312a" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.404013 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-create-6jl85" event={"ID":"453b3d54-509f-4f11-a718-0bd8e271953e","Type":"ContainerDied","Data":"7240e984b355badd42d196e20f37ee7b6f58e28b61dbdbfdf1db72bb5a666f61"} Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.404060 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7240e984b355badd42d196e20f37ee7b6f58e28b61dbdbfdf1db72bb5a666f61" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.404130 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-create-6jl85" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.420749 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzvhn\" (UniqueName: \"kubernetes.io/projected/453b3d54-509f-4f11-a718-0bd8e271953e-kube-api-access-lzvhn\") pod \"453b3d54-509f-4f11-a718-0bd8e271953e\" (UID: \"453b3d54-509f-4f11-a718-0bd8e271953e\") " Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.421250 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8nknf\" (UniqueName: \"kubernetes.io/projected/7f66d34e-14df-45dc-b7a6-8a83c4f6b19f-kube-api-access-8nknf\") pod \"7f66d34e-14df-45dc-b7a6-8a83c4f6b19f\" (UID: \"7f66d34e-14df-45dc-b7a6-8a83c4f6b19f\") " Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.421316 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f66d34e-14df-45dc-b7a6-8a83c4f6b19f-operator-scripts\") pod \"7f66d34e-14df-45dc-b7a6-8a83c4f6b19f\" (UID: \"7f66d34e-14df-45dc-b7a6-8a83c4f6b19f\") " Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.421342 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qfggb\" (UniqueName: \"kubernetes.io/projected/36534f7f-ff2e-45f0-8fc1-827ae9ebd32b-kube-api-access-qfggb\") pod \"36534f7f-ff2e-45f0-8fc1-827ae9ebd32b\" (UID: \"36534f7f-ff2e-45f0-8fc1-827ae9ebd32b\") " Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.421371 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/453b3d54-509f-4f11-a718-0bd8e271953e-operator-scripts\") pod \"453b3d54-509f-4f11-a718-0bd8e271953e\" (UID: \"453b3d54-509f-4f11-a718-0bd8e271953e\") " Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.421398 4765 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36534f7f-ff2e-45f0-8fc1-827ae9ebd32b-operator-scripts\") pod \"36534f7f-ff2e-45f0-8fc1-827ae9ebd32b\" (UID: \"36534f7f-ff2e-45f0-8fc1-827ae9ebd32b\") " Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.422008 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36534f7f-ff2e-45f0-8fc1-827ae9ebd32b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "36534f7f-ff2e-45f0-8fc1-827ae9ebd32b" (UID: "36534f7f-ff2e-45f0-8fc1-827ae9ebd32b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.422019 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7f66d34e-14df-45dc-b7a6-8a83c4f6b19f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7f66d34e-14df-45dc-b7a6-8a83c4f6b19f" (UID: "7f66d34e-14df-45dc-b7a6-8a83c4f6b19f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.423260 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/453b3d54-509f-4f11-a718-0bd8e271953e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "453b3d54-509f-4f11-a718-0bd8e271953e" (UID: "453b3d54-509f-4f11-a718-0bd8e271953e"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.426734 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-31fd-account-create-update-9b8kl" event={"ID":"7f66d34e-14df-45dc-b7a6-8a83c4f6b19f","Type":"ContainerDied","Data":"8b786fadf47a0e9b276cfa82b1df096e6b2395416366fbd3959da0e825c37dc1"} Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.426783 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8b786fadf47a0e9b276cfa82b1df096e6b2395416366fbd3959da0e825c37dc1" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.426887 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-31fd-account-create-update-9b8kl" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.427289 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/453b3d54-509f-4f11-a718-0bd8e271953e-kube-api-access-lzvhn" (OuterVolumeSpecName: "kube-api-access-lzvhn") pod "453b3d54-509f-4f11-a718-0bd8e271953e" (UID: "453b3d54-509f-4f11-a718-0bd8e271953e"). InnerVolumeSpecName "kube-api-access-lzvhn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.427614 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36534f7f-ff2e-45f0-8fc1-827ae9ebd32b-kube-api-access-qfggb" (OuterVolumeSpecName: "kube-api-access-qfggb") pod "36534f7f-ff2e-45f0-8fc1-827ae9ebd32b" (UID: "36534f7f-ff2e-45f0-8fc1-827ae9ebd32b"). InnerVolumeSpecName "kube-api-access-qfggb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.430040 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565282-gnhzm" event={"ID":"4ea0d631-9296-4da6-96c3-a94b068052ed","Type":"ContainerStarted","Data":"94bdb7273953ef8283814f1d3f3b7f7193259ee75c3ec4dd561977918f579a69"} Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.453016 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7f66d34e-14df-45dc-b7a6-8a83c4f6b19f-kube-api-access-8nknf" (OuterVolumeSpecName: "kube-api-access-8nknf") pod "7f66d34e-14df-45dc-b7a6-8a83c4f6b19f" (UID: "7f66d34e-14df-45dc-b7a6-8a83c4f6b19f"). InnerVolumeSpecName "kube-api-access-8nknf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.461000 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-create-h2h92" event={"ID":"36534f7f-ff2e-45f0-8fc1-827ae9ebd32b","Type":"ContainerDied","Data":"9c684371ab4119977e41ac6a959c8ed9a3dbb0b30a355028cfba95b3465480e7"} Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.461050 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9c684371ab4119977e41ac6a959c8ed9a3dbb0b30a355028cfba95b3465480e7" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.461176 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-create-h2h92" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.524358 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8nknf\" (UniqueName: \"kubernetes.io/projected/7f66d34e-14df-45dc-b7a6-8a83c4f6b19f-kube-api-access-8nknf\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.524390 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7f66d34e-14df-45dc-b7a6-8a83c4f6b19f-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.524401 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qfggb\" (UniqueName: \"kubernetes.io/projected/36534f7f-ff2e-45f0-8fc1-827ae9ebd32b-kube-api-access-qfggb\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.524411 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/453b3d54-509f-4f11-a718-0bd8e271953e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.524421 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/36534f7f-ff2e-45f0-8fc1-827ae9ebd32b-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:02 crc kubenswrapper[4765]: I0319 10:42:02.524430 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzvhn\" (UniqueName: \"kubernetes.io/projected/453b3d54-509f-4f11-a718-0bd8e271953e-kube-api-access-lzvhn\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.061334 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.170837 4765 kubelet.go:2437] 
"SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2qgsd"] Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.176691 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" podUID="94883a00-ab76-403f-8733-9d2413012855" containerName="dnsmasq-dns" containerID="cri-o://12f794c628abcf06c7915cdc0ed4ebc02684fff5d50145b6e69b6f22c1da6822" gracePeriod=10 Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.530058 4765 generic.go:334] "Generic (PLEG): container finished" podID="94883a00-ab76-403f-8733-9d2413012855" containerID="12f794c628abcf06c7915cdc0ed4ebc02684fff5d50145b6e69b6f22c1da6822" exitCode=0 Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.530288 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" event={"ID":"94883a00-ab76-403f-8733-9d2413012855","Type":"ContainerDied","Data":"12f794c628abcf06c7915cdc0ed4ebc02684fff5d50145b6e69b6f22c1da6822"} Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.674775 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.740036 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-config\") pod \"94883a00-ab76-403f-8733-9d2413012855\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.740227 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-ovsdbserver-sb\") pod \"94883a00-ab76-403f-8733-9d2413012855\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.740280 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-dns-svc\") pod \"94883a00-ab76-403f-8733-9d2413012855\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.740344 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-ovsdbserver-nb\") pod \"94883a00-ab76-403f-8733-9d2413012855\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.740400 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tr65\" (UniqueName: \"kubernetes.io/projected/94883a00-ab76-403f-8733-9d2413012855-kube-api-access-6tr65\") pod \"94883a00-ab76-403f-8733-9d2413012855\" (UID: \"94883a00-ab76-403f-8733-9d2413012855\") " Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.748922 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/94883a00-ab76-403f-8733-9d2413012855-kube-api-access-6tr65" (OuterVolumeSpecName: "kube-api-access-6tr65") pod "94883a00-ab76-403f-8733-9d2413012855" (UID: "94883a00-ab76-403f-8733-9d2413012855"). InnerVolumeSpecName "kube-api-access-6tr65". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.788152 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-config" (OuterVolumeSpecName: "config") pod "94883a00-ab76-403f-8733-9d2413012855" (UID: "94883a00-ab76-403f-8733-9d2413012855"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.788196 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "94883a00-ab76-403f-8733-9d2413012855" (UID: "94883a00-ab76-403f-8733-9d2413012855"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.788211 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "94883a00-ab76-403f-8733-9d2413012855" (UID: "94883a00-ab76-403f-8733-9d2413012855"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.797626 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "94883a00-ab76-403f-8733-9d2413012855" (UID: "94883a00-ab76-403f-8733-9d2413012855"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.842192 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.842228 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.842239 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.842248 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6tr65\" (UniqueName: \"kubernetes.io/projected/94883a00-ab76-403f-8733-9d2413012855-kube-api-access-6tr65\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:06 crc kubenswrapper[4765]: I0319 10:42:06.842260 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/94883a00-ab76-403f-8733-9d2413012855-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:07 crc kubenswrapper[4765]: I0319 10:42:07.554090 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" event={"ID":"94883a00-ab76-403f-8733-9d2413012855","Type":"ContainerDied","Data":"a42d636193d071177e8016de59ffcb924d4536cebe07a3ef9cdd32a5080703b1"} Mar 19 10:42:07 crc kubenswrapper[4765]: I0319 10:42:07.554389 4765 scope.go:117] "RemoveContainer" containerID="12f794c628abcf06c7915cdc0ed4ebc02684fff5d50145b6e69b6f22c1da6822" Mar 19 10:42:07 crc kubenswrapper[4765]: I0319 10:42:07.554148 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-b8fbc5445-2qgsd" Mar 19 10:42:07 crc kubenswrapper[4765]: I0319 10:42:07.556434 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-597fd" event={"ID":"4216cf70-5a7a-4d05-8c3d-20c4af295ac4","Type":"ContainerStarted","Data":"6ee19f54a1109bb1d46de72704b26d9f8cb61d2bf04b661e24ff583f45d89752"} Mar 19 10:42:07 crc kubenswrapper[4765]: I0319 10:42:07.558864 4765 generic.go:334] "Generic (PLEG): container finished" podID="4ea0d631-9296-4da6-96c3-a94b068052ed" containerID="72065889be02d6207cab7936615f505cfd50e1668f8f850ab2de7523c9998cc5" exitCode=0 Mar 19 10:42:07 crc kubenswrapper[4765]: I0319 10:42:07.558912 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565282-gnhzm" event={"ID":"4ea0d631-9296-4da6-96c3-a94b068052ed","Type":"ContainerDied","Data":"72065889be02d6207cab7936615f505cfd50e1668f8f850ab2de7523c9998cc5"} Mar 19 10:42:07 crc kubenswrapper[4765]: I0319 10:42:07.577844 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-db-sync-597fd" podStartSLOduration=3.601062219 podStartE2EDuration="10.577810443s" podCreationTimestamp="2026-03-19 10:41:57 +0000 UTC" firstStartedPulling="2026-03-19 10:41:59.320302162 +0000 UTC m=+1217.669247704" lastFinishedPulling="2026-03-19 10:42:06.297050386 +0000 UTC m=+1224.645995928" observedRunningTime="2026-03-19 10:42:07.570027252 +0000 UTC m=+1225.918972804" watchObservedRunningTime="2026-03-19 10:42:07.577810443 +0000 UTC m=+1225.926755985" Mar 19 10:42:07 crc kubenswrapper[4765]: I0319 10:42:07.599955 4765 scope.go:117] "RemoveContainer" containerID="239ab186dc20d74ad0b9369dff9ee09519beec1de4ea39f16dd3e86139c99cb6" Mar 19 10:42:07 crc kubenswrapper[4765]: I0319 10:42:07.624880 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2qgsd"] Mar 19 10:42:07 crc kubenswrapper[4765]: I0319 10:42:07.633507 4765 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-b8fbc5445-2qgsd"] Mar 19 10:42:08 crc kubenswrapper[4765]: I0319 10:42:08.366771 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="94883a00-ab76-403f-8733-9d2413012855" path="/var/lib/kubelet/pods/94883a00-ab76-403f-8733-9d2413012855/volumes" Mar 19 10:42:08 crc kubenswrapper[4765]: I0319 10:42:08.986390 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565282-gnhzm" Mar 19 10:42:09 crc kubenswrapper[4765]: I0319 10:42:09.083645 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tggnw\" (UniqueName: \"kubernetes.io/projected/4ea0d631-9296-4da6-96c3-a94b068052ed-kube-api-access-tggnw\") pod \"4ea0d631-9296-4da6-96c3-a94b068052ed\" (UID: \"4ea0d631-9296-4da6-96c3-a94b068052ed\") " Mar 19 10:42:09 crc kubenswrapper[4765]: I0319 10:42:09.091108 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ea0d631-9296-4da6-96c3-a94b068052ed-kube-api-access-tggnw" (OuterVolumeSpecName: "kube-api-access-tggnw") pod "4ea0d631-9296-4da6-96c3-a94b068052ed" (UID: "4ea0d631-9296-4da6-96c3-a94b068052ed"). InnerVolumeSpecName "kube-api-access-tggnw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:09 crc kubenswrapper[4765]: I0319 10:42:09.185336 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tggnw\" (UniqueName: \"kubernetes.io/projected/4ea0d631-9296-4da6-96c3-a94b068052ed-kube-api-access-tggnw\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:09 crc kubenswrapper[4765]: I0319 10:42:09.580400 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565282-gnhzm" event={"ID":"4ea0d631-9296-4da6-96c3-a94b068052ed","Type":"ContainerDied","Data":"94bdb7273953ef8283814f1d3f3b7f7193259ee75c3ec4dd561977918f579a69"} Mar 19 10:42:09 crc kubenswrapper[4765]: I0319 10:42:09.581003 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94bdb7273953ef8283814f1d3f3b7f7193259ee75c3ec4dd561977918f579a69" Mar 19 10:42:09 crc kubenswrapper[4765]: I0319 10:42:09.580485 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565282-gnhzm" Mar 19 10:42:10 crc kubenswrapper[4765]: I0319 10:42:10.057082 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565276-tfhl8"] Mar 19 10:42:10 crc kubenswrapper[4765]: I0319 10:42:10.064399 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565276-tfhl8"] Mar 19 10:42:10 crc kubenswrapper[4765]: I0319 10:42:10.367896 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d2bf5f7-9350-4366-8930-8a9383045a69" path="/var/lib/kubelet/pods/0d2bf5f7-9350-4366-8930-8a9383045a69/volumes" Mar 19 10:42:12 crc kubenswrapper[4765]: I0319 10:42:12.608397 4765 generic.go:334] "Generic (PLEG): container finished" podID="4216cf70-5a7a-4d05-8c3d-20c4af295ac4" containerID="6ee19f54a1109bb1d46de72704b26d9f8cb61d2bf04b661e24ff583f45d89752" exitCode=0 Mar 19 10:42:12 crc kubenswrapper[4765]: I0319 10:42:12.608491 4765 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-597fd" event={"ID":"4216cf70-5a7a-4d05-8c3d-20c4af295ac4","Type":"ContainerDied","Data":"6ee19f54a1109bb1d46de72704b26d9f8cb61d2bf04b661e24ff583f45d89752"} Mar 19 10:42:13 crc kubenswrapper[4765]: I0319 10:42:13.951920 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-597fd" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.069560 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-config-data\") pod \"4216cf70-5a7a-4d05-8c3d-20c4af295ac4\" (UID: \"4216cf70-5a7a-4d05-8c3d-20c4af295ac4\") " Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.069668 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-combined-ca-bundle\") pod \"4216cf70-5a7a-4d05-8c3d-20c4af295ac4\" (UID: \"4216cf70-5a7a-4d05-8c3d-20c4af295ac4\") " Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.069737 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qk4jj\" (UniqueName: \"kubernetes.io/projected/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-kube-api-access-qk4jj\") pod \"4216cf70-5a7a-4d05-8c3d-20c4af295ac4\" (UID: \"4216cf70-5a7a-4d05-8c3d-20c4af295ac4\") " Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.076073 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-kube-api-access-qk4jj" (OuterVolumeSpecName: "kube-api-access-qk4jj") pod "4216cf70-5a7a-4d05-8c3d-20c4af295ac4" (UID: "4216cf70-5a7a-4d05-8c3d-20c4af295ac4"). InnerVolumeSpecName "kube-api-access-qk4jj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.099315 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "4216cf70-5a7a-4d05-8c3d-20c4af295ac4" (UID: "4216cf70-5a7a-4d05-8c3d-20c4af295ac4"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.116615 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-config-data" (OuterVolumeSpecName: "config-data") pod "4216cf70-5a7a-4d05-8c3d-20c4af295ac4" (UID: "4216cf70-5a7a-4d05-8c3d-20c4af295ac4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.173164 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.173240 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.173263 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qk4jj\" (UniqueName: \"kubernetes.io/projected/4216cf70-5a7a-4d05-8c3d-20c4af295ac4-kube-api-access-qk4jj\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.625433 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-db-sync-597fd" 
event={"ID":"4216cf70-5a7a-4d05-8c3d-20c4af295ac4","Type":"ContainerDied","Data":"44bb943bd826c9fa0523639654583523ee540544cf1c37dfed7d0081ce558c6a"} Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.625486 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="44bb943bd826c9fa0523639654583523ee540544cf1c37dfed7d0081ce558c6a" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.625506 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-db-sync-597fd" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.906860 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-btcwx"] Mar 19 10:42:14 crc kubenswrapper[4765]: E0319 10:42:14.907276 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4914033-41f9-467d-b55e-c1d89a8fab4b" containerName="mariadb-database-create" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907294 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4914033-41f9-467d-b55e-c1d89a8fab4b" containerName="mariadb-database-create" Mar 19 10:42:14 crc kubenswrapper[4765]: E0319 10:42:14.907303 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94883a00-ab76-403f-8733-9d2413012855" containerName="init" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907310 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="94883a00-ab76-403f-8733-9d2413012855" containerName="init" Mar 19 10:42:14 crc kubenswrapper[4765]: E0319 10:42:14.907321 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="94883a00-ab76-403f-8733-9d2413012855" containerName="dnsmasq-dns" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907338 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="94883a00-ab76-403f-8733-9d2413012855" containerName="dnsmasq-dns" Mar 19 10:42:14 crc kubenswrapper[4765]: E0319 10:42:14.907348 4765 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="4216cf70-5a7a-4d05-8c3d-20c4af295ac4" containerName="keystone-db-sync" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907354 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="4216cf70-5a7a-4d05-8c3d-20c4af295ac4" containerName="keystone-db-sync" Mar 19 10:42:14 crc kubenswrapper[4765]: E0319 10:42:14.907369 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="453b3d54-509f-4f11-a718-0bd8e271953e" containerName="mariadb-database-create" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907378 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="453b3d54-509f-4f11-a718-0bd8e271953e" containerName="mariadb-database-create" Mar 19 10:42:14 crc kubenswrapper[4765]: E0319 10:42:14.907388 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="549655a2-d327-424f-ae72-6491fa466bdd" containerName="mariadb-account-create-update" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907395 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="549655a2-d327-424f-ae72-6491fa466bdd" containerName="mariadb-account-create-update" Mar 19 10:42:14 crc kubenswrapper[4765]: E0319 10:42:14.907404 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7f66d34e-14df-45dc-b7a6-8a83c4f6b19f" containerName="mariadb-account-create-update" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907411 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="7f66d34e-14df-45dc-b7a6-8a83c4f6b19f" containerName="mariadb-account-create-update" Mar 19 10:42:14 crc kubenswrapper[4765]: E0319 10:42:14.907425 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ea0d631-9296-4da6-96c3-a94b068052ed" containerName="oc" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907433 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ea0d631-9296-4da6-96c3-a94b068052ed" containerName="oc" Mar 19 10:42:14 crc kubenswrapper[4765]: E0319 10:42:14.907444 4765 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36534f7f-ff2e-45f0-8fc1-827ae9ebd32b" containerName="mariadb-database-create" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907450 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="36534f7f-ff2e-45f0-8fc1-827ae9ebd32b" containerName="mariadb-database-create" Mar 19 10:42:14 crc kubenswrapper[4765]: E0319 10:42:14.907460 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95479eb2-1db7-4def-a4da-5ce9dbf85e13" containerName="mariadb-account-create-update" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907465 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="95479eb2-1db7-4def-a4da-5ce9dbf85e13" containerName="mariadb-account-create-update" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907614 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="4216cf70-5a7a-4d05-8c3d-20c4af295ac4" containerName="keystone-db-sync" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907626 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="94883a00-ab76-403f-8733-9d2413012855" containerName="dnsmasq-dns" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907641 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="95479eb2-1db7-4def-a4da-5ce9dbf85e13" containerName="mariadb-account-create-update" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907648 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ea0d631-9296-4da6-96c3-a94b068052ed" containerName="oc" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907656 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="7f66d34e-14df-45dc-b7a6-8a83c4f6b19f" containerName="mariadb-account-create-update" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907662 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="36534f7f-ff2e-45f0-8fc1-827ae9ebd32b" 
containerName="mariadb-database-create" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907673 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="549655a2-d327-424f-ae72-6491fa466bdd" containerName="mariadb-account-create-update" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907683 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4914033-41f9-467d-b55e-c1d89a8fab4b" containerName="mariadb-database-create" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.907703 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="453b3d54-509f-4f11-a718-0bd8e271953e" containerName="mariadb-database-create" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.908558 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.926508 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-rhcqb"] Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.927740 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.938287 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.938604 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zvnkb" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.938644 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.939223 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.941551 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.941716 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rhcqb"] Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.952931 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-btcwx"] Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.996817 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-combined-ca-bundle\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.997260 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " 
pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.997315 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7t9lb\" (UniqueName: \"kubernetes.io/projected/4ca14f03-d78d-4343-820a-b1d5f937fa4c-kube-api-access-7t9lb\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.997350 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-config-data\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.997390 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-config\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.997447 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdc24\" (UniqueName: \"kubernetes.io/projected/16f0a7af-1034-49f2-852f-c48f260f56fb-kube-api-access-hdc24\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.997509 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-scripts\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") 
" pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.997557 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-credential-keys\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.997603 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.997637 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-fernet-keys\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.997666 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:14 crc kubenswrapper[4765]: I0319 10:42:14.997761 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: 
\"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.102456 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-credential-keys\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.102512 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.102538 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-fernet-keys\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.102556 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.102611 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" 
Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.102636 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-combined-ca-bundle\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.102669 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.102697 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7t9lb\" (UniqueName: \"kubernetes.io/projected/4ca14f03-d78d-4343-820a-b1d5f937fa4c-kube-api-access-7t9lb\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.102715 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-config-data\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.102742 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-config\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.102772 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdc24\" (UniqueName: \"kubernetes.io/projected/16f0a7af-1034-49f2-852f-c48f260f56fb-kube-api-access-hdc24\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.102807 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-scripts\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.103922 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-dns-swift-storage-0\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.104633 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-config\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.105755 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-dns-svc\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.112246 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-combined-ca-bundle\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.115549 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6b8fd97f7-54zm7"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.121692 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-fernet-keys\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.126647 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.129366 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-config-data\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.129538 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-ovsdbserver-sb\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.129934 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-ovsdbserver-nb\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " 
pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.129947 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-credential-keys\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.135629 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon-horizon-dockercfg-ndr2h" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.135747 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-scripts\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.135883 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-scripts" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.135934 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"horizon-config-data" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.136082 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"horizon" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.146739 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7t9lb\" (UniqueName: \"kubernetes.io/projected/4ca14f03-d78d-4343-820a-b1d5f937fa4c-kube-api-access-7t9lb\") pod \"dnsmasq-dns-bbf5cc879-btcwx\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.185450 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b8fd97f7-54zm7"] Mar 19 10:42:15 
crc kubenswrapper[4765]: I0319 10:42:15.208457 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-config-data\") pod \"horizon-6b8fd97f7-54zm7\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.208560 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-logs\") pod \"horizon-6b8fd97f7-54zm7\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.208852 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-scripts\") pod \"horizon-6b8fd97f7-54zm7\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.208909 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-horizon-secret-key\") pod \"horizon-6b8fd97f7-54zm7\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.209052 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbnfj\" (UniqueName: \"kubernetes.io/projected/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-kube-api-access-mbnfj\") pod \"horizon-6b8fd97f7-54zm7\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 
10:42:15.228495 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.233036 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdc24\" (UniqueName: \"kubernetes.io/projected/16f0a7af-1034-49f2-852f-c48f260f56fb-kube-api-access-hdc24\") pod \"keystone-bootstrap-rhcqb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.248654 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.285308 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-db-sync-mbhfs"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.286890 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.296138 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7z8jd" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.296322 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.296455 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.313033 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-config-data\") pod \"horizon-6b8fd97f7-54zm7\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.313085 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tn9jt\" (UniqueName: \"kubernetes.io/projected/21c84c7a-03f0-4ab5-a259-95a351cbdf13-kube-api-access-tn9jt\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.313119 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-logs\") pod \"horizon-6b8fd97f7-54zm7\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.313162 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-config-data\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.313191 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-combined-ca-bundle\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.313217 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-scripts\") pod \"horizon-6b8fd97f7-54zm7\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.313237 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-horizon-secret-key\") pod \"horizon-6b8fd97f7-54zm7\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.313282 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbnfj\" (UniqueName: \"kubernetes.io/projected/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-kube-api-access-mbnfj\") pod \"horizon-6b8fd97f7-54zm7\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.313305 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-scripts\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.313332 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-db-sync-config-data\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.313363 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21c84c7a-03f0-4ab5-a259-95a351cbdf13-etc-machine-id\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.314667 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/configmap/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-config-data\") pod \"horizon-6b8fd97f7-54zm7\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.314922 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-logs\") pod \"horizon-6b8fd97f7-54zm7\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.315325 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-scripts\") pod \"horizon-6b8fd97f7-54zm7\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.321441 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-db-sync-q2fr9"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.336939 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.350614 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-horizon-secret-key\") pod \"horizon-6b8fd97f7-54zm7\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.359020 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-scripts" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.359322 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-config-data" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.359460 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"placement-placement-dockercfg-mnc59" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.364526 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mbhfs"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.401538 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-q2fr9"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.405501 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbnfj\" (UniqueName: \"kubernetes.io/projected/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-kube-api-access-mbnfj\") pod \"horizon-6b8fd97f7-54zm7\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.419608 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-config-data\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " 
pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.420824 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-combined-ca-bundle\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.421172 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-scripts\") pod \"placement-db-sync-q2fr9\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.421619 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-scripts\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.421878 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-db-sync-config-data\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.429331 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-scripts\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.430237 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21c84c7a-03f0-4ab5-a259-95a351cbdf13-etc-machine-id\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.430992 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21c84c7a-03f0-4ab5-a259-95a351cbdf13-etc-machine-id\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.431070 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tn9jt\" (UniqueName: \"kubernetes.io/projected/21c84c7a-03f0-4ab5-a259-95a351cbdf13-kube-api-access-tn9jt\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.431163 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed90710f-8437-4621-b01e-a78cb4f0a96c-logs\") pod \"placement-db-sync-q2fr9\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.431330 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx7g9\" (UniqueName: \"kubernetes.io/projected/ed90710f-8437-4621-b01e-a78cb4f0a96c-kube-api-access-dx7g9\") pod \"placement-db-sync-q2fr9\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.431869 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-combined-ca-bundle\") pod \"placement-db-sync-q2fr9\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.431947 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-config-data\") pod \"placement-db-sync-q2fr9\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.437911 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-combined-ca-bundle\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.451801 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-db-sync-config-data\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.453137 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-config-data\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.454815 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-db-sync-jjjh8"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.461540 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jjjh8" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.468846 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.469139 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.469307 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-p7tmf" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.487763 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tn9jt\" (UniqueName: \"kubernetes.io/projected/21c84c7a-03f0-4ab5-a259-95a351cbdf13-kube-api-access-tn9jt\") pod \"cinder-db-sync-mbhfs\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.535303 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-combined-ca-bundle\") pod \"placement-db-sync-q2fr9\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.535353 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-config-data\") pod \"placement-db-sync-q2fr9\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.535392 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-config\") pod \"neutron-db-sync-jjjh8\" (UID: 
\"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e\") " pod="openstack/neutron-db-sync-jjjh8" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.535447 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-scripts\") pod \"placement-db-sync-q2fr9\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.535542 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsdwl\" (UniqueName: \"kubernetes.io/projected/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-kube-api-access-fsdwl\") pod \"neutron-db-sync-jjjh8\" (UID: \"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e\") " pod="openstack/neutron-db-sync-jjjh8" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.535569 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-combined-ca-bundle\") pod \"neutron-db-sync-jjjh8\" (UID: \"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e\") " pod="openstack/neutron-db-sync-jjjh8" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.535631 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed90710f-8437-4621-b01e-a78cb4f0a96c-logs\") pod \"placement-db-sync-q2fr9\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.535656 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dx7g9\" (UniqueName: \"kubernetes.io/projected/ed90710f-8437-4621-b01e-a78cb4f0a96c-kube-api-access-dx7g9\") pod \"placement-db-sync-q2fr9\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " 
pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.553141 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed90710f-8437-4621-b01e-a78cb4f0a96c-logs\") pod \"placement-db-sync-q2fr9\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.558003 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-scripts\") pod \"placement-db-sync-q2fr9\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.562177 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jjjh8"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.565582 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-combined-ca-bundle\") pod \"placement-db-sync-q2fr9\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.596469 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx7g9\" (UniqueName: \"kubernetes.io/projected/ed90710f-8437-4621-b01e-a78cb4f0a96c-kube-api-access-dx7g9\") pod \"placement-db-sync-q2fr9\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.604775 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.609342 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-config-data\") pod \"placement-db-sync-q2fr9\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.623053 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.628796 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.632430 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.632915 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.637117 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fsdwl\" (UniqueName: \"kubernetes.io/projected/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-kube-api-access-fsdwl\") pod \"neutron-db-sync-jjjh8\" (UID: \"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e\") " pod="openstack/neutron-db-sync-jjjh8" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.637177 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-combined-ca-bundle\") pod \"neutron-db-sync-jjjh8\" (UID: \"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e\") " pod="openstack/neutron-db-sync-jjjh8" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.637294 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-config\") pod \"neutron-db-sync-jjjh8\" (UID: \"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e\") " pod="openstack/neutron-db-sync-jjjh8" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.653850 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-config\") pod \"neutron-db-sync-jjjh8\" (UID: \"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e\") " pod="openstack/neutron-db-sync-jjjh8" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.660710 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-combined-ca-bundle\") pod \"neutron-db-sync-jjjh8\" (UID: \"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e\") " pod="openstack/neutron-db-sync-jjjh8" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.670806 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-btcwx"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.685076 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.705763 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-844b88bcb9-74x2j"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.707709 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.708248 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.719159 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsdwl\" (UniqueName: \"kubernetes.io/projected/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-kube-api-access-fsdwl\") pod \"neutron-db-sync-jjjh8\" (UID: \"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e\") " pod="openstack/neutron-db-sync-jjjh8" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.732148 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.747299 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8a2ac88-8b8a-485c-b952-d89590f71f68-config-data\") pod \"horizon-844b88bcb9-74x2j\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.747437 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-log-httpd\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.747480 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8a2ac88-8b8a-485c-b952-d89590f71f68-scripts\") pod \"horizon-844b88bcb9-74x2j\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.747576 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-scripts\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.747620 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxkwx\" (UniqueName: \"kubernetes.io/projected/e8a2ac88-8b8a-485c-b952-d89590f71f68-kube-api-access-zxkwx\") pod \"horizon-844b88bcb9-74x2j\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.747675 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8a2ac88-8b8a-485c-b952-d89590f71f68-horizon-secret-key\") pod \"horizon-844b88bcb9-74x2j\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.747727 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vm6s\" (UniqueName: \"kubernetes.io/projected/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-kube-api-access-9vm6s\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.747768 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8a2ac88-8b8a-485c-b952-d89590f71f68-logs\") pod \"horizon-844b88bcb9-74x2j\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.747810 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-config-data\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.747896 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.747943 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-run-httpd\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.747986 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.778848 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-db-sync-cczjt"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.802465 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jjjh8" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.804746 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-cczjt" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.813346 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f4tjw" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.813854 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.854500 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-log-httpd\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.854641 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8a2ac88-8b8a-485c-b952-d89590f71f68-scripts\") pod \"horizon-844b88bcb9-74x2j\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.855511 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-log-httpd\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.855585 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-scripts\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.855642 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxkwx\" (UniqueName: 
\"kubernetes.io/projected/e8a2ac88-8b8a-485c-b952-d89590f71f68-kube-api-access-zxkwx\") pod \"horizon-844b88bcb9-74x2j\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.855693 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4012221-7e3d-4ee8-9c90-e564931f5a30-db-sync-config-data\") pod \"barbican-db-sync-cczjt\" (UID: \"e4012221-7e3d-4ee8-9c90-e564931f5a30\") " pod="openstack/barbican-db-sync-cczjt" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.878858 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8a2ac88-8b8a-485c-b952-d89590f71f68-horizon-secret-key\") pod \"horizon-844b88bcb9-74x2j\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.878955 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vm6s\" (UniqueName: \"kubernetes.io/projected/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-kube-api-access-9vm6s\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.878990 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4012221-7e3d-4ee8-9c90-e564931f5a30-combined-ca-bundle\") pod \"barbican-db-sync-cczjt\" (UID: \"e4012221-7e3d-4ee8-9c90-e564931f5a30\") " pod="openstack/barbican-db-sync-cczjt" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.879045 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/e8a2ac88-8b8a-485c-b952-d89590f71f68-logs\") pod \"horizon-844b88bcb9-74x2j\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.879099 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-config-data\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.879193 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.884043 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-run-httpd\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.884094 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.884159 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8a2ac88-8b8a-485c-b952-d89590f71f68-config-data\") pod \"horizon-844b88bcb9-74x2j\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 
10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.884186 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wds8\" (UniqueName: \"kubernetes.io/projected/e4012221-7e3d-4ee8-9c90-e564931f5a30-kube-api-access-7wds8\") pod \"barbican-db-sync-cczjt\" (UID: \"e4012221-7e3d-4ee8-9c90-e564931f5a30\") " pod="openstack/barbican-db-sync-cczjt" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.887565 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8a2ac88-8b8a-485c-b952-d89590f71f68-logs\") pod \"horizon-844b88bcb9-74x2j\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.887081 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-run-httpd\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.889552 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8a2ac88-8b8a-485c-b952-d89590f71f68-config-data\") pod \"horizon-844b88bcb9-74x2j\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.892515 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8a2ac88-8b8a-485c-b952-d89590f71f68-horizon-secret-key\") pod \"horizon-844b88bcb9-74x2j\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.899376 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" 
(UniqueName: \"kubernetes.io/configmap/e8a2ac88-8b8a-485c-b952-d89590f71f68-scripts\") pod \"horizon-844b88bcb9-74x2j\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.904012 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.904909 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.916936 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-config-data\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.919861 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-scripts\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.929243 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cczjt"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.930663 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxkwx\" (UniqueName: 
\"kubernetes.io/projected/e8a2ac88-8b8a-485c-b952-d89590f71f68-kube-api-access-zxkwx\") pod \"horizon-844b88bcb9-74x2j\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.931105 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vm6s\" (UniqueName: \"kubernetes.io/projected/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-kube-api-access-9vm6s\") pod \"ceilometer-0\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " pod="openstack/ceilometer-0" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.944021 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-844b88bcb9-74x2j"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.965660 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-nvlgk"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.967439 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.977991 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-nvlgk"] Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.990633 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wds8\" (UniqueName: \"kubernetes.io/projected/e4012221-7e3d-4ee8-9c90-e564931f5a30-kube-api-access-7wds8\") pod \"barbican-db-sync-cczjt\" (UID: \"e4012221-7e3d-4ee8-9c90-e564931f5a30\") " pod="openstack/barbican-db-sync-cczjt" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.990883 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4012221-7e3d-4ee8-9c90-e564931f5a30-db-sync-config-data\") pod \"barbican-db-sync-cczjt\" (UID: \"e4012221-7e3d-4ee8-9c90-e564931f5a30\") " pod="openstack/barbican-db-sync-cczjt" Mar 19 10:42:15 crc kubenswrapper[4765]: I0319 10:42:15.990986 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4012221-7e3d-4ee8-9c90-e564931f5a30-combined-ca-bundle\") pod \"barbican-db-sync-cczjt\" (UID: \"e4012221-7e3d-4ee8-9c90-e564931f5a30\") " pod="openstack/barbican-db-sync-cczjt" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.000484 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4012221-7e3d-4ee8-9c90-e564931f5a30-combined-ca-bundle\") pod \"barbican-db-sync-cczjt\" (UID: \"e4012221-7e3d-4ee8-9c90-e564931f5a30\") " pod="openstack/barbican-db-sync-cczjt" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.007330 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.014066 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.015761 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4012221-7e3d-4ee8-9c90-e564931f5a30-db-sync-config-data\") pod \"barbican-db-sync-cczjt\" (UID: \"e4012221-7e3d-4ee8-9c90-e564931f5a30\") " pod="openstack/barbican-db-sync-cczjt" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.021239 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.031564 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.031738 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xrfsf" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.032281 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.032544 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.032669 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.041444 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wds8\" (UniqueName: \"kubernetes.io/projected/e4012221-7e3d-4ee8-9c90-e564931f5a30-kube-api-access-7wds8\") pod \"barbican-db-sync-cczjt\" (UID: \"e4012221-7e3d-4ee8-9c90-e564931f5a30\") " 
pod="openstack/barbican-db-sync-cczjt" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.044973 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.065755 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.067506 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.072104 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.072658 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.095345 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2nm4j\" (UniqueName: \"kubernetes.io/projected/10618530-67d6-40d7-94a2-4d956874d442-kube-api-access-2nm4j\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.095420 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2xfd\" (UniqueName: \"kubernetes.io/projected/cb60c064-bde6-44f0-bc52-0da1205a7561-kube-api-access-l2xfd\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.095523 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-config-data\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.095563 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.095599 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-scripts\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.095635 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.095672 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.095713 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.095737 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.095774 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-config\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.095821 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10618530-67d6-40d7-94a2-4d956874d442-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.095852 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10618530-67d6-40d7-94a2-4d956874d442-logs\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.095891 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.095927 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.105301 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.140989 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-rhcqb"] Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.190104 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-db-sync-cczjt" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197017 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l2xfd\" (UniqueName: \"kubernetes.io/projected/cb60c064-bde6-44f0-bc52-0da1205a7561-kube-api-access-l2xfd\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197083 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197116 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/106dc5f6-567f-4876-a58f-ce99f120f564-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197145 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-config-data\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197171 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: 
\"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197190 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pgbj5\" (UniqueName: \"kubernetes.io/projected/106dc5f6-567f-4876-a58f-ce99f120f564-kube-api-access-pgbj5\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197211 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-scripts\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197233 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197250 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197269 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-scripts\") pod \"glance-default-internal-api-0\" 
(UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197291 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197324 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197345 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197367 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-config\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197390 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/106dc5f6-567f-4876-a58f-ce99f120f564-logs\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " 
pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197416 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10618530-67d6-40d7-94a2-4d956874d442-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197439 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10618530-67d6-40d7-94a2-4d956874d442-logs\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197469 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197494 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197518 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-config-data\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 
10:42:16.197543 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.197572 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2nm4j\" (UniqueName: \"kubernetes.io/projected/10618530-67d6-40d7-94a2-4d956874d442-kube-api-access-2nm4j\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.199149 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-dns-svc\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.199748 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-ovsdbserver-sb\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.200371 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-ovsdbserver-nb\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.200921 4765 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-config\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.201173 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.204577 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10618530-67d6-40d7-94a2-4d956874d442-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.204613 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10618530-67d6-40d7-94a2-4d956874d442-logs\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.208127 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-config-data\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.215569 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: 
\"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-dns-swift-storage-0\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.221590 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.228916 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l2xfd\" (UniqueName: \"kubernetes.io/projected/cb60c064-bde6-44f0-bc52-0da1205a7561-kube-api-access-l2xfd\") pod \"dnsmasq-dns-56df8fb6b7-nvlgk\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.233677 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-scripts\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.234477 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2nm4j\" (UniqueName: \"kubernetes.io/projected/10618530-67d6-40d7-94a2-4d956874d442-kube-api-access-2nm4j\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.236540 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.250358 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.279054 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-btcwx"] Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.298721 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/106dc5f6-567f-4876-a58f-ce99f120f564-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.298789 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pgbj5\" (UniqueName: \"kubernetes.io/projected/106dc5f6-567f-4876-a58f-ce99f120f564-kube-api-access-pgbj5\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.298823 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.298851 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-scripts\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.298908 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/106dc5f6-567f-4876-a58f-ce99f120f564-logs\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.298949 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.298993 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-config-data\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.299056 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.303531 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" 
(UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.303861 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/106dc5f6-567f-4876-a58f-ce99f120f564-logs\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.303896 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/106dc5f6-567f-4876-a58f-ce99f120f564-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.304584 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.309741 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-scripts\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.310716 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-config-data\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.311462 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.322493 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.330539 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pgbj5\" (UniqueName: \"kubernetes.io/projected/106dc5f6-567f-4876-a58f-ce99f120f564-kube-api-access-pgbj5\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 
crc kubenswrapper[4765]: I0319 10:42:16.351694 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.405451 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.445007 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.454342 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6b8fd97f7-54zm7"] Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.502121 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.598153 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-db-sync-mbhfs"] Mar 19 10:42:16 crc kubenswrapper[4765]: W0319 10:42:16.604129 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poded90710f_8437_4621_b01e_a78cb4f0a96c.slice/crio-c05b9477bcc89ed9ff849704d2af14832e002e4970ae81d969d88cf36882e00c WatchSource:0}: Error finding container c05b9477bcc89ed9ff849704d2af14832e002e4970ae81d969d88cf36882e00c: Status 404 returned error can't find the container with id c05b9477bcc89ed9ff849704d2af14832e002e4970ae81d969d88cf36882e00c Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.619830 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-db-sync-q2fr9"] Mar 19 10:42:16 crc kubenswrapper[4765]: W0319 10:42:16.789176 4765 manager.go:1169] Failed to process 
watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfb34ed2a_1bbe_4ee9_9c00_7c1c340a868e.slice/crio-2b03b80b028fc4c98f9bf244f0d1b2cd85e26dede99cb3387a758fab567c0a2f WatchSource:0}: Error finding container 2b03b80b028fc4c98f9bf244f0d1b2cd85e26dede99cb3387a758fab567c0a2f: Status 404 returned error can't find the container with id 2b03b80b028fc4c98f9bf244f0d1b2cd85e26dede99cb3387a758fab567c0a2f Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.790477 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b8fd97f7-54zm7" event={"ID":"2790b039-8c4f-4fe6-9b79-c7bf16532cd1","Type":"ContainerStarted","Data":"c435d9a7cbbe1ea098b4965925bce7fc0b99b07a0cb09eb7e0d6abb253cd5cbd"} Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.794929 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-db-sync-jjjh8"] Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.804850 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q2fr9" event={"ID":"ed90710f-8437-4621-b01e-a78cb4f0a96c","Type":"ContainerStarted","Data":"c05b9477bcc89ed9ff849704d2af14832e002e4970ae81d969d88cf36882e00c"} Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.817732 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" event={"ID":"4ca14f03-d78d-4343-820a-b1d5f937fa4c","Type":"ContainerStarted","Data":"86d4ad001a8e070cc2aec31ad87cb780b7ded2d84a6091f998f9f937861c945d"} Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.817788 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" event={"ID":"4ca14f03-d78d-4343-820a-b1d5f937fa4c","Type":"ContainerStarted","Data":"edbcfcced9a55eb829ed73fb21bc5c1cd41f8bcc227172e70c2c55619fcb725b"} Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.854765 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/keystone-bootstrap-rhcqb" event={"ID":"16f0a7af-1034-49f2-852f-c48f260f56fb","Type":"ContainerStarted","Data":"33fb7a0188d2230ab9609cf706f229cd7f7d88f877b33adcacd7e8705e07dfcb"} Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.854826 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rhcqb" event={"ID":"16f0a7af-1034-49f2-852f-c48f260f56fb","Type":"ContainerStarted","Data":"de5fcb41a03eadaf93b14679048be61ba5082ea8e0c2d7d20bc0c93cc6afeb08"} Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.872358 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mbhfs" event={"ID":"21c84c7a-03f0-4ab5-a259-95a351cbdf13","Type":"ContainerStarted","Data":"0c37c357db57e9b538ce00318541aba7846373220b57be0d3e67f96109aaaf3c"} Mar 19 10:42:16 crc kubenswrapper[4765]: I0319 10:42:16.900618 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-rhcqb" podStartSLOduration=2.900595667 podStartE2EDuration="2.900595667s" podCreationTimestamp="2026-03-19 10:42:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:42:16.883067471 +0000 UTC m=+1235.232013013" watchObservedRunningTime="2026-03-19 10:42:16.900595667 +0000 UTC m=+1235.249541209" Mar 19 10:42:17 crc kubenswrapper[4765]: I0319 10:42:17.008009 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-844b88bcb9-74x2j"] Mar 19 10:42:17 crc kubenswrapper[4765]: W0319 10:42:17.008918 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfc731af7_c5a0_4d4e_9f33_9deec0f322ee.slice/crio-404ca7642462232243d4052d8da91384de2d1401b2c4aae33dc6940f76aa75ab WatchSource:0}: Error finding container 404ca7642462232243d4052d8da91384de2d1401b2c4aae33dc6940f76aa75ab: Status 404 returned error can't find the 
container with id 404ca7642462232243d4052d8da91384de2d1401b2c4aae33dc6940f76aa75ab Mar 19 10:42:17 crc kubenswrapper[4765]: W0319 10:42:17.009404 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode8a2ac88_8b8a_485c_b952_d89590f71f68.slice/crio-0489165d62c1c4edb545bd4c3d89e5bec738aa0717f66d471d675642db58f737 WatchSource:0}: Error finding container 0489165d62c1c4edb545bd4c3d89e5bec738aa0717f66d471d675642db58f737: Status 404 returned error can't find the container with id 0489165d62c1c4edb545bd4c3d89e5bec738aa0717f66d471d675642db58f737 Mar 19 10:42:17 crc kubenswrapper[4765]: I0319 10:42:17.024424 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:42:17 crc kubenswrapper[4765]: I0319 10:42:17.331844 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 10:42:17 crc kubenswrapper[4765]: I0319 10:42:17.382700 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-db-sync-cczjt"] Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.431392 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-nvlgk"] Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.557942 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.655721 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.743562 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-ovsdbserver-sb\") pod \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.743612 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-ovsdbserver-nb\") pod \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.743655 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-config\") pod \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.743721 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-dns-swift-storage-0\") pod \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.743792 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-dns-svc\") pod \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.743883 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7t9lb\" 
(UniqueName: \"kubernetes.io/projected/4ca14f03-d78d-4343-820a-b1d5f937fa4c-kube-api-access-7t9lb\") pod \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\" (UID: \"4ca14f03-d78d-4343-820a-b1d5f937fa4c\") " Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.764172 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4ca14f03-d78d-4343-820a-b1d5f937fa4c-kube-api-access-7t9lb" (OuterVolumeSpecName: "kube-api-access-7t9lb") pod "4ca14f03-d78d-4343-820a-b1d5f937fa4c" (UID: "4ca14f03-d78d-4343-820a-b1d5f937fa4c"). InnerVolumeSpecName "kube-api-access-7t9lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.833509 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.878136 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "4ca14f03-d78d-4343-820a-b1d5f937fa4c" (UID: "4ca14f03-d78d-4343-820a-b1d5f937fa4c"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.878943 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-config" (OuterVolumeSpecName: "config") pod "4ca14f03-d78d-4343-820a-b1d5f937fa4c" (UID: "4ca14f03-d78d-4343-820a-b1d5f937fa4c"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.890507 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7t9lb\" (UniqueName: \"kubernetes.io/projected/4ca14f03-d78d-4343-820a-b1d5f937fa4c-kube-api-access-7t9lb\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.890539 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.890554 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.910877 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "4ca14f03-d78d-4343-820a-b1d5f937fa4c" (UID: "4ca14f03-d78d-4343-820a-b1d5f937fa4c"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.918774 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "4ca14f03-d78d-4343-820a-b1d5f937fa4c" (UID: "4ca14f03-d78d-4343-820a-b1d5f937fa4c"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.919524 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "4ca14f03-d78d-4343-820a-b1d5f937fa4c" (UID: "4ca14f03-d78d-4343-820a-b1d5f937fa4c"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.989381 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jjjh8" event={"ID":"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e","Type":"ContainerStarted","Data":"a0e7bd5f7d8c30648a07ded5e04182fb4b1b36a99584c14b6806d67eba09527b"} Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.989448 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jjjh8" event={"ID":"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e","Type":"ContainerStarted","Data":"2b03b80b028fc4c98f9bf244f0d1b2cd85e26dede99cb3387a758fab567c0a2f"} Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.991745 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.991766 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:17.991778 4765 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/4ca14f03-d78d-4343-820a-b1d5f937fa4c-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 
10:42:17.992036 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b8fd97f7-54zm7"] Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.006497 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.026444 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cczjt" event={"ID":"e4012221-7e3d-4ee8-9c90-e564931f5a30","Type":"ContainerStarted","Data":"649c5d1dbced6925beea6beac3206676ae8b823fcfe7ca4f4fefd392137720f1"} Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.038878 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-754cf44cd5-5c6mc"] Mar 19 10:42:18 crc kubenswrapper[4765]: E0319 10:42:18.039375 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4ca14f03-d78d-4343-820a-b1d5f937fa4c" containerName="init" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.039390 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="4ca14f03-d78d-4343-820a-b1d5f937fa4c" containerName="init" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.039582 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="4ca14f03-d78d-4343-820a-b1d5f937fa4c" containerName="init" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.040528 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.048077 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"106dc5f6-567f-4876-a58f-ce99f120f564","Type":"ContainerStarted","Data":"aeef183052cfdec52376f375ff256a3ef4174b1c43078e596d57a77517d6e0a6"} Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.052833 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"10618530-67d6-40d7-94a2-4d956874d442","Type":"ContainerStarted","Data":"c3075dc09cd494d6359238de12b21e25e44495db1e63bcf4caadac115c09786c"} Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.053909 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.055008 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844b88bcb9-74x2j" event={"ID":"e8a2ac88-8b8a-485c-b952-d89590f71f68","Type":"ContainerStarted","Data":"0489165d62c1c4edb545bd4c3d89e5bec738aa0717f66d471d675642db58f737"} Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.056324 4765 generic.go:334] "Generic (PLEG): container finished" podID="4ca14f03-d78d-4343-820a-b1d5f937fa4c" containerID="86d4ad001a8e070cc2aec31ad87cb780b7ded2d84a6091f998f9f937861c945d" exitCode=0 Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.056379 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" event={"ID":"4ca14f03-d78d-4343-820a-b1d5f937fa4c","Type":"ContainerDied","Data":"86d4ad001a8e070cc2aec31ad87cb780b7ded2d84a6091f998f9f937861c945d"} Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.056401 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" 
event={"ID":"4ca14f03-d78d-4343-820a-b1d5f937fa4c","Type":"ContainerDied","Data":"edbcfcced9a55eb829ed73fb21bc5c1cd41f8bcc227172e70c2c55619fcb725b"} Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.056421 4765 scope.go:117] "RemoveContainer" containerID="86d4ad001a8e070cc2aec31ad87cb780b7ded2d84a6091f998f9f937861c945d" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.056576 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bbf5cc879-btcwx" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.081209 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc731af7-c5a0-4d4e-9f33-9deec0f322ee","Type":"ContainerStarted","Data":"404ca7642462232243d4052d8da91384de2d1401b2c4aae33dc6940f76aa75ab"} Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.082107 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-754cf44cd5-5c6mc"] Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.093298 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9006a2a-0e22-4e7a-ac04-dffa6500b246-logs\") pod \"horizon-754cf44cd5-5c6mc\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.093442 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9006a2a-0e22-4e7a-ac04-dffa6500b246-horizon-secret-key\") pod \"horizon-754cf44cd5-5c6mc\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.093529 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lk6zd\" (UniqueName: 
\"kubernetes.io/projected/b9006a2a-0e22-4e7a-ac04-dffa6500b246-kube-api-access-lk6zd\") pod \"horizon-754cf44cd5-5c6mc\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.093576 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9006a2a-0e22-4e7a-ac04-dffa6500b246-scripts\") pod \"horizon-754cf44cd5-5c6mc\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.093610 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9006a2a-0e22-4e7a-ac04-dffa6500b246-config-data\") pod \"horizon-754cf44cd5-5c6mc\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.095831 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" event={"ID":"cb60c064-bde6-44f0-bc52-0da1205a7561","Type":"ContainerStarted","Data":"ab11f5eeff91ed252a3c4753f59e53435c14f7f52a70a3098fbd1036ff609d03"} Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.109500 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-db-sync-jjjh8" podStartSLOduration=3.109466633 podStartE2EDuration="3.109466633s" podCreationTimestamp="2026-03-19 10:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:42:18.022840674 +0000 UTC m=+1236.371786236" watchObservedRunningTime="2026-03-19 10:42:18.109466633 +0000 UTC m=+1236.458412175" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.155790 4765 scope.go:117] "RemoveContainer" 
containerID="86d4ad001a8e070cc2aec31ad87cb780b7ded2d84a6091f998f9f937861c945d" Mar 19 10:42:18 crc kubenswrapper[4765]: E0319 10:42:18.157418 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86d4ad001a8e070cc2aec31ad87cb780b7ded2d84a6091f998f9f937861c945d\": container with ID starting with 86d4ad001a8e070cc2aec31ad87cb780b7ded2d84a6091f998f9f937861c945d not found: ID does not exist" containerID="86d4ad001a8e070cc2aec31ad87cb780b7ded2d84a6091f998f9f937861c945d" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.157461 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86d4ad001a8e070cc2aec31ad87cb780b7ded2d84a6091f998f9f937861c945d"} err="failed to get container status \"86d4ad001a8e070cc2aec31ad87cb780b7ded2d84a6091f998f9f937861c945d\": rpc error: code = NotFound desc = could not find container \"86d4ad001a8e070cc2aec31ad87cb780b7ded2d84a6091f998f9f937861c945d\": container with ID starting with 86d4ad001a8e070cc2aec31ad87cb780b7ded2d84a6091f998f9f937861c945d not found: ID does not exist" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.197191 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9006a2a-0e22-4e7a-ac04-dffa6500b246-horizon-secret-key\") pod \"horizon-754cf44cd5-5c6mc\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.197313 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lk6zd\" (UniqueName: \"kubernetes.io/projected/b9006a2a-0e22-4e7a-ac04-dffa6500b246-kube-api-access-lk6zd\") pod \"horizon-754cf44cd5-5c6mc\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.197373 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9006a2a-0e22-4e7a-ac04-dffa6500b246-scripts\") pod \"horizon-754cf44cd5-5c6mc\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.197401 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9006a2a-0e22-4e7a-ac04-dffa6500b246-config-data\") pod \"horizon-754cf44cd5-5c6mc\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.197443 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9006a2a-0e22-4e7a-ac04-dffa6500b246-logs\") pod \"horizon-754cf44cd5-5c6mc\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.204823 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9006a2a-0e22-4e7a-ac04-dffa6500b246-config-data\") pod \"horizon-754cf44cd5-5c6mc\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.205181 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9006a2a-0e22-4e7a-ac04-dffa6500b246-logs\") pod \"horizon-754cf44cd5-5c6mc\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.207439 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9006a2a-0e22-4e7a-ac04-dffa6500b246-scripts\") pod 
\"horizon-754cf44cd5-5c6mc\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.219225 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9006a2a-0e22-4e7a-ac04-dffa6500b246-horizon-secret-key\") pod \"horizon-754cf44cd5-5c6mc\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.231486 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lk6zd\" (UniqueName: \"kubernetes.io/projected/b9006a2a-0e22-4e7a-ac04-dffa6500b246-kube-api-access-lk6zd\") pod \"horizon-754cf44cd5-5c6mc\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.380744 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.414936 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-btcwx"] Mar 19 10:42:18 crc kubenswrapper[4765]: I0319 10:42:18.415066 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bbf5cc879-btcwx"] Mar 19 10:42:19 crc kubenswrapper[4765]: I0319 10:42:19.182072 4765 generic.go:334] "Generic (PLEG): container finished" podID="cb60c064-bde6-44f0-bc52-0da1205a7561" containerID="1412cf4a5fbc04b01e4c26a9f36eb1e8fec5df7dee65c94e7086b0c7b5eecedd" exitCode=0 Mar 19 10:42:19 crc kubenswrapper[4765]: I0319 10:42:19.183382 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" event={"ID":"cb60c064-bde6-44f0-bc52-0da1205a7561","Type":"ContainerDied","Data":"1412cf4a5fbc04b01e4c26a9f36eb1e8fec5df7dee65c94e7086b0c7b5eecedd"} Mar 19 10:42:19 crc kubenswrapper[4765]: 
I0319 10:42:19.300626 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-754cf44cd5-5c6mc"] Mar 19 10:42:19 crc kubenswrapper[4765]: W0319 10:42:19.402495 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb9006a2a_0e22_4e7a_ac04_dffa6500b246.slice/crio-dc15badf0d9838c6284f85cf1866087f59b7d191fa203d16f5bd1252c5b1ffa7 WatchSource:0}: Error finding container dc15badf0d9838c6284f85cf1866087f59b7d191fa203d16f5bd1252c5b1ffa7: Status 404 returned error can't find the container with id dc15badf0d9838c6284f85cf1866087f59b7d191fa203d16f5bd1252c5b1ffa7 Mar 19 10:42:20 crc kubenswrapper[4765]: I0319 10:42:20.196982 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-754cf44cd5-5c6mc" event={"ID":"b9006a2a-0e22-4e7a-ac04-dffa6500b246","Type":"ContainerStarted","Data":"dc15badf0d9838c6284f85cf1866087f59b7d191fa203d16f5bd1252c5b1ffa7"} Mar 19 10:42:20 crc kubenswrapper[4765]: I0319 10:42:20.209913 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" event={"ID":"cb60c064-bde6-44f0-bc52-0da1205a7561","Type":"ContainerStarted","Data":"85fece65b39d9fceec6e98c184220bd7dff7c03217c3ee3d8fc7ee66dd43aa4e"} Mar 19 10:42:20 crc kubenswrapper[4765]: I0319 10:42:20.211380 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:20 crc kubenswrapper[4765]: I0319 10:42:20.217661 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"106dc5f6-567f-4876-a58f-ce99f120f564","Type":"ContainerStarted","Data":"a945b6261e43806009e442d791d0d2ef539275b33aff2fa75923fd4e488057ba"} Mar 19 10:42:20 crc kubenswrapper[4765]: I0319 10:42:20.225626 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" 
event={"ID":"10618530-67d6-40d7-94a2-4d956874d442","Type":"ContainerStarted","Data":"c93c7a99bca21658e395392407b022c02419266ea29cfa9bf6731a980e641a2b"} Mar 19 10:42:20 crc kubenswrapper[4765]: I0319 10:42:20.235819 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" podStartSLOduration=5.235796444 podStartE2EDuration="5.235796444s" podCreationTimestamp="2026-03-19 10:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:42:20.231858357 +0000 UTC m=+1238.580803929" watchObservedRunningTime="2026-03-19 10:42:20.235796444 +0000 UTC m=+1238.584741996" Mar 19 10:42:20 crc kubenswrapper[4765]: I0319 10:42:20.377140 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ca14f03-d78d-4343-820a-b1d5f937fa4c" path="/var/lib/kubelet/pods/4ca14f03-d78d-4343-820a-b1d5f937fa4c/volumes" Mar 19 10:42:21 crc kubenswrapper[4765]: I0319 10:42:21.248645 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"10618530-67d6-40d7-94a2-4d956874d442","Type":"ContainerStarted","Data":"e0a58e50be5daa44b0442d628ee0cce2884bee4a5b3abf1f741073311e13a34d"} Mar 19 10:42:21 crc kubenswrapper[4765]: I0319 10:42:21.248747 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="10618530-67d6-40d7-94a2-4d956874d442" containerName="glance-log" containerID="cri-o://c93c7a99bca21658e395392407b022c02419266ea29cfa9bf6731a980e641a2b" gracePeriod=30 Mar 19 10:42:21 crc kubenswrapper[4765]: I0319 10:42:21.249018 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="10618530-67d6-40d7-94a2-4d956874d442" containerName="glance-httpd" containerID="cri-o://e0a58e50be5daa44b0442d628ee0cce2884bee4a5b3abf1f741073311e13a34d" 
gracePeriod=30 Mar 19 10:42:21 crc kubenswrapper[4765]: I0319 10:42:21.252656 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"106dc5f6-567f-4876-a58f-ce99f120f564","Type":"ContainerStarted","Data":"1f579a1ee5befabd98dbdbf85af712c987f0f9f6b406a853d04964d19b2abff5"} Mar 19 10:42:21 crc kubenswrapper[4765]: I0319 10:42:21.252751 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="106dc5f6-567f-4876-a58f-ce99f120f564" containerName="glance-log" containerID="cri-o://a945b6261e43806009e442d791d0d2ef539275b33aff2fa75923fd4e488057ba" gracePeriod=30 Mar 19 10:42:21 crc kubenswrapper[4765]: I0319 10:42:21.252792 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="106dc5f6-567f-4876-a58f-ce99f120f564" containerName="glance-httpd" containerID="cri-o://1f579a1ee5befabd98dbdbf85af712c987f0f9f6b406a853d04964d19b2abff5" gracePeriod=30 Mar 19 10:42:21 crc kubenswrapper[4765]: I0319 10:42:21.353366 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.353343133 podStartE2EDuration="6.353343133s" podCreationTimestamp="2026-03-19 10:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:42:21.317672106 +0000 UTC m=+1239.666617658" watchObservedRunningTime="2026-03-19 10:42:21.353343133 +0000 UTC m=+1239.702288675" Mar 19 10:42:21 crc kubenswrapper[4765]: I0319 10:42:21.354157 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=6.354152445 podStartE2EDuration="6.354152445s" podCreationTimestamp="2026-03-19 10:42:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:42:21.297436207 +0000 UTC m=+1239.646381749" watchObservedRunningTime="2026-03-19 10:42:21.354152445 +0000 UTC m=+1239.703097987" Mar 19 10:42:22 crc kubenswrapper[4765]: I0319 10:42:22.275838 4765 generic.go:334] "Generic (PLEG): container finished" podID="106dc5f6-567f-4876-a58f-ce99f120f564" containerID="1f579a1ee5befabd98dbdbf85af712c987f0f9f6b406a853d04964d19b2abff5" exitCode=0 Mar 19 10:42:22 crc kubenswrapper[4765]: I0319 10:42:22.276213 4765 generic.go:334] "Generic (PLEG): container finished" podID="106dc5f6-567f-4876-a58f-ce99f120f564" containerID="a945b6261e43806009e442d791d0d2ef539275b33aff2fa75923fd4e488057ba" exitCode=143 Mar 19 10:42:22 crc kubenswrapper[4765]: I0319 10:42:22.275911 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"106dc5f6-567f-4876-a58f-ce99f120f564","Type":"ContainerDied","Data":"1f579a1ee5befabd98dbdbf85af712c987f0f9f6b406a853d04964d19b2abff5"} Mar 19 10:42:22 crc kubenswrapper[4765]: I0319 10:42:22.276261 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"106dc5f6-567f-4876-a58f-ce99f120f564","Type":"ContainerDied","Data":"a945b6261e43806009e442d791d0d2ef539275b33aff2fa75923fd4e488057ba"} Mar 19 10:42:22 crc kubenswrapper[4765]: I0319 10:42:22.281407 4765 generic.go:334] "Generic (PLEG): container finished" podID="10618530-67d6-40d7-94a2-4d956874d442" containerID="e0a58e50be5daa44b0442d628ee0cce2884bee4a5b3abf1f741073311e13a34d" exitCode=0 Mar 19 10:42:22 crc kubenswrapper[4765]: I0319 10:42:22.281435 4765 generic.go:334] "Generic (PLEG): container finished" podID="10618530-67d6-40d7-94a2-4d956874d442" containerID="c93c7a99bca21658e395392407b022c02419266ea29cfa9bf6731a980e641a2b" exitCode=143 Mar 19 10:42:22 crc kubenswrapper[4765]: I0319 10:42:22.281550 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/glance-default-external-api-0" event={"ID":"10618530-67d6-40d7-94a2-4d956874d442","Type":"ContainerDied","Data":"e0a58e50be5daa44b0442d628ee0cce2884bee4a5b3abf1f741073311e13a34d"} Mar 19 10:42:22 crc kubenswrapper[4765]: I0319 10:42:22.281637 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"10618530-67d6-40d7-94a2-4d956874d442","Type":"ContainerDied","Data":"c93c7a99bca21658e395392407b022c02419266ea29cfa9bf6731a980e641a2b"} Mar 19 10:42:22 crc kubenswrapper[4765]: I0319 10:42:22.283980 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rhcqb" event={"ID":"16f0a7af-1034-49f2-852f-c48f260f56fb","Type":"ContainerDied","Data":"33fb7a0188d2230ab9609cf706f229cd7f7d88f877b33adcacd7e8705e07dfcb"} Mar 19 10:42:22 crc kubenswrapper[4765]: I0319 10:42:22.283983 4765 generic.go:334] "Generic (PLEG): container finished" podID="16f0a7af-1034-49f2-852f-c48f260f56fb" containerID="33fb7a0188d2230ab9609cf706f229cd7f7d88f877b33adcacd7e8705e07dfcb" exitCode=0 Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.708091 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-844b88bcb9-74x2j"] Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.738402 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-5c6bdcb6fb-89kxv"] Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.740190 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.745891 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-horizon-svc" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.780736 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c6bdcb6fb-89kxv"] Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.858337 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-754cf44cd5-5c6mc"] Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.872651 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-horizon-tls-certs\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.872740 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5112f66b-28fa-4500-b77b-351b8c3d0519-config-data\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.872782 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5112f66b-28fa-4500-b77b-351b8c3d0519-scripts\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.872808 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hphkq\" (UniqueName: 
\"kubernetes.io/projected/5112f66b-28fa-4500-b77b-351b8c3d0519-kube-api-access-hphkq\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.872828 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-combined-ca-bundle\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.872903 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-horizon-secret-key\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.872922 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5112f66b-28fa-4500-b77b-351b8c3d0519-logs\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.900339 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/horizon-6c6ff5646d-fmdz2"] Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.902045 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.923052 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c6ff5646d-fmdz2"] Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.974434 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xkrqq\" (UniqueName: \"kubernetes.io/projected/b506e362-44bf-4267-bea0-18131aa011fa-kube-api-access-xkrqq\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.974522 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5112f66b-28fa-4500-b77b-351b8c3d0519-config-data\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.974577 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b506e362-44bf-4267-bea0-18131aa011fa-combined-ca-bundle\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.974609 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5112f66b-28fa-4500-b77b-351b8c3d0519-scripts\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.974648 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hphkq\" (UniqueName: 
\"kubernetes.io/projected/5112f66b-28fa-4500-b77b-351b8c3d0519-kube-api-access-hphkq\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.974675 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-combined-ca-bundle\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.974704 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b506e362-44bf-4267-bea0-18131aa011fa-horizon-secret-key\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.974750 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b506e362-44bf-4267-bea0-18131aa011fa-logs\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.974801 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b506e362-44bf-4267-bea0-18131aa011fa-config-data\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.974848 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: 
\"kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-horizon-secret-key\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.974871 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5112f66b-28fa-4500-b77b-351b8c3d0519-logs\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.974915 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b506e362-44bf-4267-bea0-18131aa011fa-scripts\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.974943 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-horizon-tls-certs\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.975002 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b506e362-44bf-4267-bea0-18131aa011fa-horizon-tls-certs\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.976459 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5112f66b-28fa-4500-b77b-351b8c3d0519-config-data\") pod 
\"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.977258 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5112f66b-28fa-4500-b77b-351b8c3d0519-scripts\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.977651 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5112f66b-28fa-4500-b77b-351b8c3d0519-logs\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.985507 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-horizon-secret-key\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.986703 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-combined-ca-bundle\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:24 crc kubenswrapper[4765]: I0319 10:42:24.992010 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-horizon-tls-certs\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:25 crc 
kubenswrapper[4765]: I0319 10:42:25.000636 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hphkq\" (UniqueName: \"kubernetes.io/projected/5112f66b-28fa-4500-b77b-351b8c3d0519-kube-api-access-hphkq\") pod \"horizon-5c6bdcb6fb-89kxv\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 10:42:25.069886 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 10:42:25.077012 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b506e362-44bf-4267-bea0-18131aa011fa-config-data\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 10:42:25.077144 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b506e362-44bf-4267-bea0-18131aa011fa-scripts\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 10:42:25.077183 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b506e362-44bf-4267-bea0-18131aa011fa-horizon-tls-certs\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 10:42:25.077225 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xkrqq\" (UniqueName: \"kubernetes.io/projected/b506e362-44bf-4267-bea0-18131aa011fa-kube-api-access-xkrqq\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: 
\"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 10:42:25.077276 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b506e362-44bf-4267-bea0-18131aa011fa-combined-ca-bundle\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 10:42:25.077314 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b506e362-44bf-4267-bea0-18131aa011fa-horizon-secret-key\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 10:42:25.077349 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b506e362-44bf-4267-bea0-18131aa011fa-logs\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 10:42:25.077809 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b506e362-44bf-4267-bea0-18131aa011fa-logs\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 10:42:25.079357 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b506e362-44bf-4267-bea0-18131aa011fa-config-data\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 
10:42:25.079427 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b506e362-44bf-4267-bea0-18131aa011fa-scripts\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 10:42:25.082783 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b506e362-44bf-4267-bea0-18131aa011fa-horizon-secret-key\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 10:42:25.083909 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b506e362-44bf-4267-bea0-18131aa011fa-combined-ca-bundle\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 10:42:25.091624 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/b506e362-44bf-4267-bea0-18131aa011fa-horizon-tls-certs\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 10:42:25.112673 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xkrqq\" (UniqueName: \"kubernetes.io/projected/b506e362-44bf-4267-bea0-18131aa011fa-kube-api-access-xkrqq\") pod \"horizon-6c6ff5646d-fmdz2\" (UID: \"b506e362-44bf-4267-bea0-18131aa011fa\") " pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:25 crc kubenswrapper[4765]: I0319 10:42:25.230755 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:26 crc kubenswrapper[4765]: I0319 10:42:26.312310 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:42:26 crc kubenswrapper[4765]: I0319 10:42:26.404601 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-fwp2t"] Mar 19 10:42:26 crc kubenswrapper[4765]: I0319 10:42:26.405005 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" podUID="c66aa28a-5eba-4c64-abec-75b6577131a4" containerName="dnsmasq-dns" containerID="cri-o://371b1829bc0d6f74ef0efb4b0612ae0419ac2477162a39935d459530c1b4c6ba" gracePeriod=10 Mar 19 10:42:27 crc kubenswrapper[4765]: I0319 10:42:27.339155 4765 generic.go:334] "Generic (PLEG): container finished" podID="c66aa28a-5eba-4c64-abec-75b6577131a4" containerID="371b1829bc0d6f74ef0efb4b0612ae0419ac2477162a39935d459530c1b4c6ba" exitCode=0 Mar 19 10:42:27 crc kubenswrapper[4765]: I0319 10:42:27.339200 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" event={"ID":"c66aa28a-5eba-4c64-abec-75b6577131a4","Type":"ContainerDied","Data":"371b1829bc0d6f74ef0efb4b0612ae0419ac2477162a39935d459530c1b4c6ba"} Mar 19 10:42:31 crc kubenswrapper[4765]: I0319 10:42:31.060159 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" podUID="c66aa28a-5eba-4c64-abec-75b6577131a4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Mar 19 10:42:31 crc kubenswrapper[4765]: E0319 10:42:31.221585 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified" Mar 19 10:42:31 crc kubenswrapper[4765]: E0319 10:42:31.221824 
4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:barbican-db-sync,Image:quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified,Command:[/bin/bash],Args:[-c barbican-manage db upgrade],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/barbican/barbican.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7wds8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42403,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:*42403,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod barbican-db-sync-cczjt_openstack(e4012221-7e3d-4ee8-9c90-e564931f5a30): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 10:42:31 crc kubenswrapper[4765]: E0319 10:42:31.223840 4765 pod_workers.go:1301] "Error syncing 
pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/barbican-db-sync-cczjt" podUID="e4012221-7e3d-4ee8-9c90-e564931f5a30" Mar 19 10:42:31 crc kubenswrapper[4765]: I0319 10:42:31.655906 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:42:31 crc kubenswrapper[4765]: I0319 10:42:31.656292 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:42:31 crc kubenswrapper[4765]: E0319 10:42:31.727261 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"barbican-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-barbican-api:current-podified\\\"\"" pod="openstack/barbican-db-sync-cczjt" podUID="e4012221-7e3d-4ee8-9c90-e564931f5a30" Mar 19 10:42:36 crc kubenswrapper[4765]: I0319 10:42:36.059359 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" podUID="c66aa28a-5eba-4c64-abec-75b6577131a4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: connect: connection refused" Mar 19 10:42:38 crc kubenswrapper[4765]: E0319 10:42:38.170651 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 19 
10:42:38 crc kubenswrapper[4765]: E0319 10:42:38.171197 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F /var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n54dh59bh5cdh59h685h58ch68h56fh697h68bhdbh56h677h64h647h56dh687h7fh5bbh54bh58bh5f5h696hbch6fh586h55bh557h58dh646h55bh659q,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-mbnfj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},Resta
rtPolicy:nil,} start failed in pod horizon-6b8fd97f7-54zm7_openstack(2790b039-8c4f-4fe6-9b79-c7bf16532cd1): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 10:42:38 crc kubenswrapper[4765]: E0319 10:42:38.175005 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-6b8fd97f7-54zm7" podUID="2790b039-8c4f-4fe6-9b79-c7bf16532cd1" Mar 19 10:42:38 crc kubenswrapper[4765]: E0319 10:42:38.193610 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 19 10:42:38 crc kubenswrapper[4765]: E0319 10:42:38.193892 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nc7h655h577h8bhd5h8bh5fbh596h57bhfch5dh55dh9hc5h5fch648hf4h574h5fch55bh54fh5bfh5c4h558hb7hcch696h9h57bh9dh676hdq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-zxkwx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-844b88bcb9-74x2j_openstack(e8a2ac88-8b8a-485c-b952-d89590f71f68): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 10:42:38 crc kubenswrapper[4765]: E0319 
10:42:38.197681 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-844b88bcb9-74x2j" podUID="e8a2ac88-8b8a-485c-b952-d89590f71f68" Mar 19 10:42:38 crc kubenswrapper[4765]: E0319 10:42:38.205072 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-horizon:current-podified" Mar 19 10:42:38 crc kubenswrapper[4765]: E0319 10:42:38.205289 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:horizon-log,Image:quay.io/podified-antelope-centos9/openstack-horizon:current-podified,Command:[/bin/bash],Args:[-c tail -n+1 -F 
/var/log/horizon/horizon.log],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:n68fh644hfdh564h5c7hb9h5f9h57bh57fh565h5b7hbbhc4h94h57bh74h596h5b5h65dh9ch75hb7h596h674hbdh5bfhc9h5ddh5d6hdh94h9cq,ValueFrom:nil,},EnvVar{Name:ENABLE_DESIGNATE,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_HEAT,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_IRONIC,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_MANILA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_OCTAVIA,Value:yes,ValueFrom:nil,},EnvVar{Name:ENABLE_WATCHER,Value:no,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},EnvVar{Name:UNPACK_THEME,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:logs,ReadOnly:false,MountPath:/var/log/horizon,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-lk6zd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*48,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*42400,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod horizon-754cf44cd5-5c6mc_openstack(b9006a2a-0e22-4e7a-ac04-dffa6500b246): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 10:42:38 crc kubenswrapper[4765]: E0319 
10:42:38.207422 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"horizon-log\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\", failed to \"StartContainer\" for \"horizon\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-horizon:current-podified\\\"\"]" pod="openstack/horizon-754cf44cd5-5c6mc" podUID="b9006a2a-0e22-4e7a-ac04-dffa6500b246" Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.277189 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.379344 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-credential-keys\") pod \"16f0a7af-1034-49f2-852f-c48f260f56fb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.379758 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-config-data\") pod \"16f0a7af-1034-49f2-852f-c48f260f56fb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.379792 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hdc24\" (UniqueName: \"kubernetes.io/projected/16f0a7af-1034-49f2-852f-c48f260f56fb-kube-api-access-hdc24\") pod \"16f0a7af-1034-49f2-852f-c48f260f56fb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.379842 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-fernet-keys\") pod 
\"16f0a7af-1034-49f2-852f-c48f260f56fb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.379867 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-scripts\") pod \"16f0a7af-1034-49f2-852f-c48f260f56fb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.379893 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-combined-ca-bundle\") pod \"16f0a7af-1034-49f2-852f-c48f260f56fb\" (UID: \"16f0a7af-1034-49f2-852f-c48f260f56fb\") " Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.387395 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f0a7af-1034-49f2-852f-c48f260f56fb-kube-api-access-hdc24" (OuterVolumeSpecName: "kube-api-access-hdc24") pod "16f0a7af-1034-49f2-852f-c48f260f56fb" (UID: "16f0a7af-1034-49f2-852f-c48f260f56fb"). InnerVolumeSpecName "kube-api-access-hdc24". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.388026 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "16f0a7af-1034-49f2-852f-c48f260f56fb" (UID: "16f0a7af-1034-49f2-852f-c48f260f56fb"). InnerVolumeSpecName "credential-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.388368 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "16f0a7af-1034-49f2-852f-c48f260f56fb" (UID: "16f0a7af-1034-49f2-852f-c48f260f56fb"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.410724 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-scripts" (OuterVolumeSpecName: "scripts") pod "16f0a7af-1034-49f2-852f-c48f260f56fb" (UID: "16f0a7af-1034-49f2-852f-c48f260f56fb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.419093 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-config-data" (OuterVolumeSpecName: "config-data") pod "16f0a7af-1034-49f2-852f-c48f260f56fb" (UID: "16f0a7af-1034-49f2-852f-c48f260f56fb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.425264 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "16f0a7af-1034-49f2-852f-c48f260f56fb" (UID: "16f0a7af-1034-49f2-852f-c48f260f56fb"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.483572 4765 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.483619 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.483633 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hdc24\" (UniqueName: \"kubernetes.io/projected/16f0a7af-1034-49f2-852f-c48f260f56fb-kube-api-access-hdc24\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.483650 4765 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.483663 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.483676 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/16f0a7af-1034-49f2-852f-c48f260f56fb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.793204 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-rhcqb" event={"ID":"16f0a7af-1034-49f2-852f-c48f260f56fb","Type":"ContainerDied","Data":"de5fcb41a03eadaf93b14679048be61ba5082ea8e0c2d7d20bc0c93cc6afeb08"} Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 
10:42:38.793269 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="de5fcb41a03eadaf93b14679048be61ba5082ea8e0c2d7d20bc0c93cc6afeb08" Mar 19 10:42:38 crc kubenswrapper[4765]: I0319 10:42:38.793338 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-rhcqb" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.364473 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-rhcqb"] Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.372845 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-rhcqb"] Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.473820 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-bootstrap-j8hqn"] Mar 19 10:42:39 crc kubenswrapper[4765]: E0319 10:42:39.474275 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="16f0a7af-1034-49f2-852f-c48f260f56fb" containerName="keystone-bootstrap" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.474293 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f0a7af-1034-49f2-852f-c48f260f56fb" containerName="keystone-bootstrap" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.474465 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="16f0a7af-1034-49f2-852f-c48f260f56fb" containerName="keystone-bootstrap" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.475097 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.478032 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.478373 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.478726 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zvnkb" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.478908 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.479128 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.483809 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j8hqn"] Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.603375 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2d82\" (UniqueName: \"kubernetes.io/projected/2ec7bc5e-c876-4f51-8135-166f8ea45721-kube-api-access-s2d82\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.603484 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-fernet-keys\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.603601 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-combined-ca-bundle\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.603689 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-scripts\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.603871 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-config-data\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.604073 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-credential-keys\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.705546 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-credential-keys\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.705667 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2d82\" (UniqueName: 
\"kubernetes.io/projected/2ec7bc5e-c876-4f51-8135-166f8ea45721-kube-api-access-s2d82\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.705692 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-fernet-keys\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.705767 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-combined-ca-bundle\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.705835 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-scripts\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.705922 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-config-data\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.712025 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-combined-ca-bundle\") pod \"keystone-bootstrap-j8hqn\" (UID: 
\"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.717016 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-fernet-keys\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.718498 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-config-data\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.723363 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-scripts\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.723809 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2d82\" (UniqueName: \"kubernetes.io/projected/2ec7bc5e-c876-4f51-8135-166f8ea45721-kube-api-access-s2d82\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.729407 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-credential-keys\") pod \"keystone-bootstrap-j8hqn\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:39 crc kubenswrapper[4765]: I0319 10:42:39.815032 4765 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:40 crc kubenswrapper[4765]: I0319 10:42:40.371099 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16f0a7af-1034-49f2-852f-c48f260f56fb" path="/var/lib/kubelet/pods/16f0a7af-1034-49f2-852f-c48f260f56fb/volumes" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.065735 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" podUID="c66aa28a-5eba-4c64-abec-75b6577131a4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: i/o timeout" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.066911 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.174653 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.195023 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.206984 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.214598 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.230448 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.231210 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbnfj\" (UniqueName: \"kubernetes.io/projected/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-kube-api-access-mbnfj\") pod \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.232423 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-config-data\") pod \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.232477 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-logs\") pod \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.232992 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-scripts\") pod \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.233036 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-horizon-secret-key\") pod \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\" (UID: \"2790b039-8c4f-4fe6-9b79-c7bf16532cd1\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.233583 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-scripts" (OuterVolumeSpecName: "scripts") pod "2790b039-8c4f-4fe6-9b79-c7bf16532cd1" (UID: "2790b039-8c4f-4fe6-9b79-c7bf16532cd1"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.233617 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-logs" (OuterVolumeSpecName: "logs") pod "2790b039-8c4f-4fe6-9b79-c7bf16532cd1" (UID: "2790b039-8c4f-4fe6-9b79-c7bf16532cd1"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.234263 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-config-data" (OuterVolumeSpecName: "config-data") pod "2790b039-8c4f-4fe6-9b79-c7bf16532cd1" (UID: "2790b039-8c4f-4fe6-9b79-c7bf16532cd1"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.234820 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.234848 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.234860 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.236787 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-kube-api-access-mbnfj" (OuterVolumeSpecName: "kube-api-access-mbnfj") pod "2790b039-8c4f-4fe6-9b79-c7bf16532cd1" (UID: "2790b039-8c4f-4fe6-9b79-c7bf16532cd1"). InnerVolumeSpecName "kube-api-access-mbnfj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.237122 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "2790b039-8c4f-4fe6-9b79-c7bf16532cd1" (UID: "2790b039-8c4f-4fe6-9b79-c7bf16532cd1"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.328037 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336393 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-combined-ca-bundle\") pod \"10618530-67d6-40d7-94a2-4d956874d442\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336458 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lk6zd\" (UniqueName: \"kubernetes.io/projected/b9006a2a-0e22-4e7a-ac04-dffa6500b246-kube-api-access-lk6zd\") pod \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336489 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8a2ac88-8b8a-485c-b952-d89590f71f68-horizon-secret-key\") pod \"e8a2ac88-8b8a-485c-b952-d89590f71f68\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336568 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2nm4j\" (UniqueName: \"kubernetes.io/projected/10618530-67d6-40d7-94a2-4d956874d442-kube-api-access-2nm4j\") pod \"10618530-67d6-40d7-94a2-4d956874d442\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336584 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8a2ac88-8b8a-485c-b952-d89590f71f68-config-data\") pod \"e8a2ac88-8b8a-485c-b952-d89590f71f68\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336610 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8a2ac88-8b8a-485c-b952-d89590f71f68-logs\") pod \"e8a2ac88-8b8a-485c-b952-d89590f71f68\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336625 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-scripts\") pod \"10618530-67d6-40d7-94a2-4d956874d442\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336655 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/b9006a2a-0e22-4e7a-ac04-dffa6500b246-scripts\") pod \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336687 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9006a2a-0e22-4e7a-ac04-dffa6500b246-config-data\") pod \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336743 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8a2ac88-8b8a-485c-b952-d89590f71f68-scripts\") pod \"e8a2ac88-8b8a-485c-b952-d89590f71f68\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336767 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10618530-67d6-40d7-94a2-4d956874d442-httpd-run\") pod \"10618530-67d6-40d7-94a2-4d956874d442\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336808 4765 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-ovsdbserver-nb\") pod \"c66aa28a-5eba-4c64-abec-75b6577131a4\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336831 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"10618530-67d6-40d7-94a2-4d956874d442\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336853 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxkwx\" (UniqueName: \"kubernetes.io/projected/e8a2ac88-8b8a-485c-b952-d89590f71f68-kube-api-access-zxkwx\") pod \"e8a2ac88-8b8a-485c-b952-d89590f71f68\" (UID: \"e8a2ac88-8b8a-485c-b952-d89590f71f68\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336887 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-dns-swift-storage-0\") pod \"c66aa28a-5eba-4c64-abec-75b6577131a4\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336915 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fd79b\" (UniqueName: \"kubernetes.io/projected/c66aa28a-5eba-4c64-abec-75b6577131a4-kube-api-access-fd79b\") pod \"c66aa28a-5eba-4c64-abec-75b6577131a4\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.336973 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-public-tls-certs\") pod 
\"10618530-67d6-40d7-94a2-4d956874d442\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.337070 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-ovsdbserver-sb\") pod \"c66aa28a-5eba-4c64-abec-75b6577131a4\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.340062 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10618530-67d6-40d7-94a2-4d956874d442-logs\") pod \"10618530-67d6-40d7-94a2-4d956874d442\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.340103 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-config\") pod \"c66aa28a-5eba-4c64-abec-75b6577131a4\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.340156 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-dns-svc\") pod \"c66aa28a-5eba-4c64-abec-75b6577131a4\" (UID: \"c66aa28a-5eba-4c64-abec-75b6577131a4\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.340181 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9006a2a-0e22-4e7a-ac04-dffa6500b246-horizon-secret-key\") pod \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.340227 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b9006a2a-0e22-4e7a-ac04-dffa6500b246-logs\") pod \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\" (UID: \"b9006a2a-0e22-4e7a-ac04-dffa6500b246\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.340265 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-config-data\") pod \"10618530-67d6-40d7-94a2-4d956874d442\" (UID: \"10618530-67d6-40d7-94a2-4d956874d442\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.341393 4765 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.341420 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbnfj\" (UniqueName: \"kubernetes.io/projected/2790b039-8c4f-4fe6-9b79-c7bf16532cd1-kube-api-access-mbnfj\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.337070 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8a2ac88-8b8a-485c-b952-d89590f71f68-logs" (OuterVolumeSpecName: "logs") pod "e8a2ac88-8b8a-485c-b952-d89590f71f68" (UID: "e8a2ac88-8b8a-485c-b952-d89590f71f68"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.337715 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a2ac88-8b8a-485c-b952-d89590f71f68-config-data" (OuterVolumeSpecName: "config-data") pod "e8a2ac88-8b8a-485c-b952-d89590f71f68" (UID: "e8a2ac88-8b8a-485c-b952-d89590f71f68"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.337749 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8a2ac88-8b8a-485c-b952-d89590f71f68-scripts" (OuterVolumeSpecName: "scripts") pod "e8a2ac88-8b8a-485c-b952-d89590f71f68" (UID: "e8a2ac88-8b8a-485c-b952-d89590f71f68"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.338256 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9006a2a-0e22-4e7a-ac04-dffa6500b246-scripts" (OuterVolumeSpecName: "scripts") pod "b9006a2a-0e22-4e7a-ac04-dffa6500b246" (UID: "b9006a2a-0e22-4e7a-ac04-dffa6500b246"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.337893 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b9006a2a-0e22-4e7a-ac04-dffa6500b246-config-data" (OuterVolumeSpecName: "config-data") pod "b9006a2a-0e22-4e7a-ac04-dffa6500b246" (UID: "b9006a2a-0e22-4e7a-ac04-dffa6500b246"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.338585 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10618530-67d6-40d7-94a2-4d956874d442-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "10618530-67d6-40d7-94a2-4d956874d442" (UID: "10618530-67d6-40d7-94a2-4d956874d442"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.339738 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8a2ac88-8b8a-485c-b952-d89590f71f68-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "e8a2ac88-8b8a-485c-b952-d89590f71f68" (UID: "e8a2ac88-8b8a-485c-b952-d89590f71f68"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.341414 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b9006a2a-0e22-4e7a-ac04-dffa6500b246-kube-api-access-lk6zd" (OuterVolumeSpecName: "kube-api-access-lk6zd") pod "b9006a2a-0e22-4e7a-ac04-dffa6500b246" (UID: "b9006a2a-0e22-4e7a-ac04-dffa6500b246"). InnerVolumeSpecName "kube-api-access-lk6zd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.342566 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/10618530-67d6-40d7-94a2-4d956874d442-kube-api-access-2nm4j" (OuterVolumeSpecName: "kube-api-access-2nm4j") pod "10618530-67d6-40d7-94a2-4d956874d442" (UID: "10618530-67d6-40d7-94a2-4d956874d442"). InnerVolumeSpecName "kube-api-access-2nm4j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.343292 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-scripts" (OuterVolumeSpecName: "scripts") pod "10618530-67d6-40d7-94a2-4d956874d442" (UID: "10618530-67d6-40d7-94a2-4d956874d442"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.344073 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c66aa28a-5eba-4c64-abec-75b6577131a4-kube-api-access-fd79b" (OuterVolumeSpecName: "kube-api-access-fd79b") pod "c66aa28a-5eba-4c64-abec-75b6577131a4" (UID: "c66aa28a-5eba-4c64-abec-75b6577131a4"). InnerVolumeSpecName "kube-api-access-fd79b". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.345769 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b9006a2a-0e22-4e7a-ac04-dffa6500b246-logs" (OuterVolumeSpecName: "logs") pod "b9006a2a-0e22-4e7a-ac04-dffa6500b246" (UID: "b9006a2a-0e22-4e7a-ac04-dffa6500b246"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.354618 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8a2ac88-8b8a-485c-b952-d89590f71f68-kube-api-access-zxkwx" (OuterVolumeSpecName: "kube-api-access-zxkwx") pod "e8a2ac88-8b8a-485c-b952-d89590f71f68" (UID: "e8a2ac88-8b8a-485c-b952-d89590f71f68"). InnerVolumeSpecName "kube-api-access-zxkwx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.355222 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/10618530-67d6-40d7-94a2-4d956874d442-logs" (OuterVolumeSpecName: "logs") pod "10618530-67d6-40d7-94a2-4d956874d442" (UID: "10618530-67d6-40d7-94a2-4d956874d442"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.368931 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "10618530-67d6-40d7-94a2-4d956874d442" (UID: "10618530-67d6-40d7-94a2-4d956874d442"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.396649 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "10618530-67d6-40d7-94a2-4d956874d442" (UID: "10618530-67d6-40d7-94a2-4d956874d442"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.403684 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b9006a2a-0e22-4e7a-ac04-dffa6500b246-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "b9006a2a-0e22-4e7a-ac04-dffa6500b246" (UID: "b9006a2a-0e22-4e7a-ac04-dffa6500b246"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.428317 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "c66aa28a-5eba-4c64-abec-75b6577131a4" (UID: "c66aa28a-5eba-4c64-abec-75b6577131a4"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.431717 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-config" (OuterVolumeSpecName: "config") pod "c66aa28a-5eba-4c64-abec-75b6577131a4" (UID: "c66aa28a-5eba-4c64-abec-75b6577131a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.432217 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-config-data" (OuterVolumeSpecName: "config-data") pod "10618530-67d6-40d7-94a2-4d956874d442" (UID: "10618530-67d6-40d7-94a2-4d956874d442"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.434317 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "c66aa28a-5eba-4c64-abec-75b6577131a4" (UID: "c66aa28a-5eba-4c64-abec-75b6577131a4"). InnerVolumeSpecName "dns-svc". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.442431 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"106dc5f6-567f-4876-a58f-ce99f120f564\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.442491 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/106dc5f6-567f-4876-a58f-ce99f120f564-httpd-run\") pod \"106dc5f6-567f-4876-a58f-ce99f120f564\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.442721 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-config-data\") pod \"106dc5f6-567f-4876-a58f-ce99f120f564\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443022 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-internal-tls-certs\") pod \"106dc5f6-567f-4876-a58f-ce99f120f564\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443095 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/106dc5f6-567f-4876-a58f-ce99f120f564-logs\") pod \"106dc5f6-567f-4876-a58f-ce99f120f564\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443173 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-scripts\") 
pod \"106dc5f6-567f-4876-a58f-ce99f120f564\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443213 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pgbj5\" (UniqueName: \"kubernetes.io/projected/106dc5f6-567f-4876-a58f-ce99f120f564-kube-api-access-pgbj5\") pod \"106dc5f6-567f-4876-a58f-ce99f120f564\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443242 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-combined-ca-bundle\") pod \"106dc5f6-567f-4876-a58f-ce99f120f564\" (UID: \"106dc5f6-567f-4876-a58f-ce99f120f564\") " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443626 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/e8a2ac88-8b8a-485c-b952-d89590f71f68-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443644 4765 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/10618530-67d6-40d7-94a2-4d956874d442-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443656 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443674 4765 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443684 4765 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-zxkwx\" (UniqueName: \"kubernetes.io/projected/e8a2ac88-8b8a-485c-b952-d89590f71f68-kube-api-access-zxkwx\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443696 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fd79b\" (UniqueName: \"kubernetes.io/projected/c66aa28a-5eba-4c64-abec-75b6577131a4-kube-api-access-fd79b\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443707 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/10618530-67d6-40d7-94a2-4d956874d442-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443719 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443731 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443742 4765 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/b9006a2a-0e22-4e7a-ac04-dffa6500b246-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443751 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b9006a2a-0e22-4e7a-ac04-dffa6500b246-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443759 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc 
kubenswrapper[4765]: I0319 10:42:46.443769 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443777 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lk6zd\" (UniqueName: \"kubernetes.io/projected/b9006a2a-0e22-4e7a-ac04-dffa6500b246-kube-api-access-lk6zd\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443786 4765 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/e8a2ac88-8b8a-485c-b952-d89590f71f68-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443795 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/e8a2ac88-8b8a-485c-b952-d89590f71f68-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443804 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2nm4j\" (UniqueName: \"kubernetes.io/projected/10618530-67d6-40d7-94a2-4d956874d442-kube-api-access-2nm4j\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443813 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8a2ac88-8b8a-485c-b952-d89590f71f68-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443821 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443830 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/b9006a2a-0e22-4e7a-ac04-dffa6500b246-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443840 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/b9006a2a-0e22-4e7a-ac04-dffa6500b246-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.443884 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/106dc5f6-567f-4876-a58f-ce99f120f564-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "106dc5f6-567f-4876-a58f-ce99f120f564" (UID: "106dc5f6-567f-4876-a58f-ce99f120f564"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.444821 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/106dc5f6-567f-4876-a58f-ce99f120f564-logs" (OuterVolumeSpecName: "logs") pod "106dc5f6-567f-4876-a58f-ce99f120f564" (UID: "106dc5f6-567f-4876-a58f-ce99f120f564"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.447195 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "106dc5f6-567f-4876-a58f-ce99f120f564" (UID: "106dc5f6-567f-4876-a58f-ce99f120f564"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.450788 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/106dc5f6-567f-4876-a58f-ce99f120f564-kube-api-access-pgbj5" (OuterVolumeSpecName: "kube-api-access-pgbj5") pod "106dc5f6-567f-4876-a58f-ce99f120f564" (UID: "106dc5f6-567f-4876-a58f-ce99f120f564"). 
InnerVolumeSpecName "kube-api-access-pgbj5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.452350 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "c66aa28a-5eba-4c64-abec-75b6577131a4" (UID: "c66aa28a-5eba-4c64-abec-75b6577131a4"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.453484 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "c66aa28a-5eba-4c64-abec-75b6577131a4" (UID: "c66aa28a-5eba-4c64-abec-75b6577131a4"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.454052 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "10618530-67d6-40d7-94a2-4d956874d442" (UID: "10618530-67d6-40d7-94a2-4d956874d442"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.457982 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-scripts" (OuterVolumeSpecName: "scripts") pod "106dc5f6-567f-4876-a58f-ce99f120f564" (UID: "106dc5f6-567f-4876-a58f-ce99f120f564"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.465586 4765 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.470906 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "106dc5f6-567f-4876-a58f-ce99f120f564" (UID: "106dc5f6-567f-4876-a58f-ce99f120f564"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.495607 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-config-data" (OuterVolumeSpecName: "config-data") pod "106dc5f6-567f-4876-a58f-ce99f120f564" (UID: "106dc5f6-567f-4876-a58f-ce99f120f564"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.500436 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "106dc5f6-567f-4876-a58f-ce99f120f564" (UID: "106dc5f6-567f-4876-a58f-ce99f120f564"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.545137 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.545364 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pgbj5\" (UniqueName: \"kubernetes.io/projected/106dc5f6-567f-4876-a58f-ce99f120f564-kube-api-access-pgbj5\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.545450 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.545531 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.545638 4765 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.545718 4765 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/106dc5f6-567f-4876-a58f-ce99f120f564-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.545778 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.545834 4765 reconciler_common.go:293] "Volume 
detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.545883 4765 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/c66aa28a-5eba-4c64-abec-75b6577131a4-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.545935 4765 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/106dc5f6-567f-4876-a58f-ce99f120f564-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.546204 4765 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/10618530-67d6-40d7-94a2-4d956874d442-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.546261 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/106dc5f6-567f-4876-a58f-ce99f120f564-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.567210 4765 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.648250 4765 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.866673 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6b8fd97f7-54zm7" event={"ID":"2790b039-8c4f-4fe6-9b79-c7bf16532cd1","Type":"ContainerDied","Data":"c435d9a7cbbe1ea098b4965925bce7fc0b99b07a0cb09eb7e0d6abb253cd5cbd"} Mar 19 
10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.866708 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-6b8fd97f7-54zm7" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.869902 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"106dc5f6-567f-4876-a58f-ce99f120f564","Type":"ContainerDied","Data":"aeef183052cfdec52376f375ff256a3ef4174b1c43078e596d57a77517d6e0a6"} Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.869929 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.869942 4765 scope.go:117] "RemoveContainer" containerID="1f579a1ee5befabd98dbdbf85af712c987f0f9f6b406a853d04964d19b2abff5" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.872736 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.872762 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"10618530-67d6-40d7-94a2-4d956874d442","Type":"ContainerDied","Data":"c3075dc09cd494d6359238de12b21e25e44495db1e63bcf4caadac115c09786c"} Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.874902 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/horizon-844b88bcb9-74x2j" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.874903 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-844b88bcb9-74x2j" event={"ID":"e8a2ac88-8b8a-485c-b952-d89590f71f68","Type":"ContainerDied","Data":"0489165d62c1c4edb545bd4c3d89e5bec738aa0717f66d471d675642db58f737"} Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.876466 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-754cf44cd5-5c6mc" event={"ID":"b9006a2a-0e22-4e7a-ac04-dffa6500b246","Type":"ContainerDied","Data":"dc15badf0d9838c6284f85cf1866087f59b7d191fa203d16f5bd1252c5b1ffa7"} Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.876505 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-754cf44cd5-5c6mc" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.881290 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" event={"ID":"c66aa28a-5eba-4c64-abec-75b6577131a4","Type":"ContainerDied","Data":"616e3b1f592660803ed84428701c5cb1fade2c75955d53b01d14e5b61416273e"} Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.883793 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.935615 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-6b8fd97f7-54zm7"] Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.945077 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-6b8fd97f7-54zm7"] Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.952984 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.961614 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.973288 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 10:42:46 crc kubenswrapper[4765]: E0319 10:42:46.973673 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106dc5f6-567f-4876-a58f-ce99f120f564" containerName="glance-httpd" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.973690 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="106dc5f6-567f-4876-a58f-ce99f120f564" containerName="glance-httpd" Mar 19 10:42:46 crc kubenswrapper[4765]: E0319 10:42:46.973703 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c66aa28a-5eba-4c64-abec-75b6577131a4" containerName="dnsmasq-dns" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.973710 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66aa28a-5eba-4c64-abec-75b6577131a4" containerName="dnsmasq-dns" Mar 19 10:42:46 crc kubenswrapper[4765]: E0319 10:42:46.973721 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10618530-67d6-40d7-94a2-4d956874d442" containerName="glance-log" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.973728 4765 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="10618530-67d6-40d7-94a2-4d956874d442" containerName="glance-log" Mar 19 10:42:46 crc kubenswrapper[4765]: E0319 10:42:46.973740 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="106dc5f6-567f-4876-a58f-ce99f120f564" containerName="glance-log" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.973746 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="106dc5f6-567f-4876-a58f-ce99f120f564" containerName="glance-log" Mar 19 10:42:46 crc kubenswrapper[4765]: E0319 10:42:46.973757 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c66aa28a-5eba-4c64-abec-75b6577131a4" containerName="init" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.973764 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c66aa28a-5eba-4c64-abec-75b6577131a4" containerName="init" Mar 19 10:42:46 crc kubenswrapper[4765]: E0319 10:42:46.973775 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="10618530-67d6-40d7-94a2-4d956874d442" containerName="glance-httpd" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.973781 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="10618530-67d6-40d7-94a2-4d956874d442" containerName="glance-httpd" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.973936 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="106dc5f6-567f-4876-a58f-ce99f120f564" containerName="glance-log" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.973975 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c66aa28a-5eba-4c64-abec-75b6577131a4" containerName="dnsmasq-dns" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.973987 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="10618530-67d6-40d7-94a2-4d956874d442" containerName="glance-log" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.973995 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="106dc5f6-567f-4876-a58f-ce99f120f564" 
containerName="glance-httpd" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.974001 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="10618530-67d6-40d7-94a2-4d956874d442" containerName="glance-httpd" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.974872 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.981237 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.981554 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-scripts" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.981690 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-glance-dockercfg-xrfsf" Mar 19 10:42:46 crc kubenswrapper[4765]: I0319 10:42:46.981825 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.005584 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.050862 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-844b88bcb9-74x2j"] Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.054793 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.065427 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.066183 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.066857 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a083bcfd-87a7-43f7-b0a3-1180bea648b3-logs\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.067397 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.067534 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk68w\" (UniqueName: \"kubernetes.io/projected/a083bcfd-87a7-43f7-b0a3-1180bea648b3-kube-api-access-xk68w\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.067657 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.071548 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a083bcfd-87a7-43f7-b0a3-1180bea648b3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.083482 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-844b88bcb9-74x2j"] Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.112858 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-754cf44cd5-5c6mc"] Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.123448 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-754cf44cd5-5c6mc"] Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.133444 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-fwp2t"] Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.156586 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-5f59b8f679-fwp2t"] Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.178032 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a083bcfd-87a7-43f7-b0a3-1180bea648b3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.178107 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.178132 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.178194 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.178222 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a083bcfd-87a7-43f7-b0a3-1180bea648b3-logs\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.178245 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.178298 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk68w\" (UniqueName: 
\"kubernetes.io/projected/a083bcfd-87a7-43f7-b0a3-1180bea648b3-kube-api-access-xk68w\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.178333 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.179633 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a083bcfd-87a7-43f7-b0a3-1180bea648b3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.179633 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a083bcfd-87a7-43f7-b0a3-1180bea648b3-logs\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.179747 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.181881 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.183252 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.187226 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.189438 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.195460 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.197203 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.200698 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk68w\" (UniqueName: \"kubernetes.io/projected/a083bcfd-87a7-43f7-b0a3-1180bea648b3-kube-api-access-xk68w\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " 
pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.210508 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.214471 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.217013 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.219789 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.233066 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.242847 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.280070 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4szlr\" (UniqueName: \"kubernetes.io/projected/8dd71512-2453-4dff-98d8-3cf981fbbb8f-kube-api-access-4szlr\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.280137 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.280180 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd71512-2453-4dff-98d8-3cf981fbbb8f-logs\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.280286 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.280389 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.280422 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.280455 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dd71512-2453-4dff-98d8-3cf981fbbb8f-httpd-run\") 
pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.280656 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-scripts\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.308725 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.381923 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4szlr\" (UniqueName: \"kubernetes.io/projected/8dd71512-2453-4dff-98d8-3cf981fbbb8f-kube-api-access-4szlr\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.381990 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-config-data\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.382024 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd71512-2453-4dff-98d8-3cf981fbbb8f-logs\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.382060 4765 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.382092 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.382110 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.382136 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dd71512-2453-4dff-98d8-3cf981fbbb8f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.382219 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-scripts\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.382491 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.382900 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dd71512-2453-4dff-98d8-3cf981fbbb8f-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.383164 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd71512-2453-4dff-98d8-3cf981fbbb8f-logs\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.386331 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.386527 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-config-data\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.389325 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-scripts\") pod \"glance-default-external-api-0\" 
(UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.390320 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.402307 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4szlr\" (UniqueName: \"kubernetes.io/projected/8dd71512-2453-4dff-98d8-3cf981fbbb8f-kube-api-access-4szlr\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.420000 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.617935 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.897463 4765 scope.go:117] "RemoveContainer" containerID="a945b6261e43806009e442d791d0d2ef539275b33aff2fa75923fd4e488057ba" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.934196 4765 scope.go:117] "RemoveContainer" containerID="e0a58e50be5daa44b0442d628ee0cce2884bee4a5b3abf1f741073311e13a34d" Mar 19 10:42:47 crc kubenswrapper[4765]: I0319 10:42:47.985415 4765 scope.go:117] "RemoveContainer" containerID="c93c7a99bca21658e395392407b022c02419266ea29cfa9bf6731a980e641a2b" Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.109869 4765 scope.go:117] "RemoveContainer" containerID="371b1829bc0d6f74ef0efb4b0612ae0419ac2477162a39935d459530c1b4c6ba" Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.185290 4765 scope.go:117] "RemoveContainer" containerID="5f76ded8fa06729ab4d0d7bd559c713625500321d87cdd2fbaa1e57208c35577" Mar 19 10:42:48 crc kubenswrapper[4765]: E0319 10:42:48.273924 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified" Mar 19 10:42:48 crc kubenswrapper[4765]: E0319 10:42:48.274120 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:cinder-db-sync,Image:quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified,Command:[/bin/bash],Args:[-c /usr/local/bin/kolla_set_configs && 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KOLLA_BOOTSTRAP,Value:TRUE,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:etc-machine-id,ReadOnly:true,MountPath:/etc/machine-id,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:true,MountPath:/usr/local/bin/container-scripts,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/config-data/merged,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/etc/my.cnf,SubPath:my.cnf,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:db-sync-config-data,ReadOnly:true,MountPath:/etc/cinder/cinder.conf.d,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:true,MountPath:/var/lib/kolla/config_files/config.json,SubPath:db-sync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-tn9jt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:nil,Privileged:nil,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin
:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-db-sync-mbhfs_openstack(21c84c7a-03f0-4ab5-a259-95a351cbdf13): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 10:42:48 crc kubenswrapper[4765]: E0319 10:42:48.275840 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/cinder-db-sync-mbhfs" podUID="21c84c7a-03f0-4ab5-a259-95a351cbdf13" Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.368598 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="10618530-67d6-40d7-94a2-4d956874d442" path="/var/lib/kubelet/pods/10618530-67d6-40d7-94a2-4d956874d442/volumes" Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.370513 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="106dc5f6-567f-4876-a58f-ce99f120f564" path="/var/lib/kubelet/pods/106dc5f6-567f-4876-a58f-ce99f120f564/volumes" Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.372530 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2790b039-8c4f-4fe6-9b79-c7bf16532cd1" path="/var/lib/kubelet/pods/2790b039-8c4f-4fe6-9b79-c7bf16532cd1/volumes" Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.373138 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b9006a2a-0e22-4e7a-ac04-dffa6500b246" path="/var/lib/kubelet/pods/b9006a2a-0e22-4e7a-ac04-dffa6500b246/volumes" Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.373608 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c66aa28a-5eba-4c64-abec-75b6577131a4" path="/var/lib/kubelet/pods/c66aa28a-5eba-4c64-abec-75b6577131a4/volumes" Mar 19 
10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.375706 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8a2ac88-8b8a-485c-b952-d89590f71f68" path="/var/lib/kubelet/pods/e8a2ac88-8b8a-485c-b952-d89590f71f68/volumes" Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.376231 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-6c6ff5646d-fmdz2"] Mar 19 10:42:48 crc kubenswrapper[4765]: W0319 10:42:48.376978 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb506e362_44bf_4267_bea0_18131aa011fa.slice/crio-01876b6049b2d73b24c9ff67732e3bd411a384b2c15f07ae763ef031b9f62b9d WatchSource:0}: Error finding container 01876b6049b2d73b24c9ff67732e3bd411a384b2c15f07ae763ef031b9f62b9d: Status 404 returned error can't find the container with id 01876b6049b2d73b24c9ff67732e3bd411a384b2c15f07ae763ef031b9f62b9d Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.449747 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/horizon-5c6bdcb6fb-89kxv"] Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.513208 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-bootstrap-j8hqn"] Mar 19 10:42:48 crc kubenswrapper[4765]: W0319 10:42:48.597776 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5112f66b_28fa_4500_b77b_351b8c3d0519.slice/crio-1dab967f1829d7055dc9c1ebc1bbad7a17f5e9bc1937b2222904e094518abd77 WatchSource:0}: Error finding container 1dab967f1829d7055dc9c1ebc1bbad7a17f5e9bc1937b2222904e094518abd77: Status 404 returned error can't find the container with id 1dab967f1829d7055dc9c1ebc1bbad7a17f5e9bc1937b2222904e094518abd77 Mar 19 10:42:48 crc kubenswrapper[4765]: W0319 10:42:48.599473 4765 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2ec7bc5e_c876_4f51_8135_166f8ea45721.slice/crio-dfd23f01ff898db2eba6bba25666241bab5ef138e831f907b435b199965a3ca9 WatchSource:0}: Error finding container dfd23f01ff898db2eba6bba25666241bab5ef138e831f907b435b199965a3ca9: Status 404 returned error can't find the container with id dfd23f01ff898db2eba6bba25666241bab5ef138e831f907b435b199965a3ca9 Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.675364 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"osp-secret" Mar 19 10:42:48 crc kubenswrapper[4765]: W0319 10:42:48.749924 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dd71512_2453_4dff_98d8_3cf981fbbb8f.slice/crio-3f490d1e63a9e9ef60bbe99ababdfcaa0de7a8f4eb2e6bc867039f2492ba6c2e WatchSource:0}: Error finding container 3f490d1e63a9e9ef60bbe99ababdfcaa0de7a8f4eb2e6bc867039f2492ba6c2e: Status 404 returned error can't find the container with id 3f490d1e63a9e9ef60bbe99ababdfcaa0de7a8f4eb2e6bc867039f2492ba6c2e Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.753011 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.904234 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c6bdcb6fb-89kxv" event={"ID":"5112f66b-28fa-4500-b77b-351b8c3d0519","Type":"ContainerStarted","Data":"1dab967f1829d7055dc9c1ebc1bbad7a17f5e9bc1937b2222904e094518abd77"} Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.909056 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c6ff5646d-fmdz2" event={"ID":"b506e362-44bf-4267-bea0-18131aa011fa","Type":"ContainerStarted","Data":"01876b6049b2d73b24c9ff67732e3bd411a384b2c15f07ae763ef031b9f62b9d"} Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.912594 4765 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8dd71512-2453-4dff-98d8-3cf981fbbb8f","Type":"ContainerStarted","Data":"3f490d1e63a9e9ef60bbe99ababdfcaa0de7a8f4eb2e6bc867039f2492ba6c2e"} Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.916367 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q2fr9" event={"ID":"ed90710f-8437-4621-b01e-a78cb4f0a96c","Type":"ContainerStarted","Data":"7521b214959d7571a4688a00f1f6cdd063309f03e9c54ad589a21fd2be254f9b"} Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.922401 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc731af7-c5a0-4d4e-9f33-9deec0f322ee","Type":"ContainerStarted","Data":"b1e3752be6605f29ac474e229120c00f527829b9f8aa536add5a101957307f0e"} Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.924890 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j8hqn" event={"ID":"2ec7bc5e-c876-4f51-8135-166f8ea45721","Type":"ContainerStarted","Data":"dfd23f01ff898db2eba6bba25666241bab5ef138e831f907b435b199965a3ca9"} Mar 19 10:42:48 crc kubenswrapper[4765]: E0319 10:42:48.928487 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"cinder-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-cinder-api:current-podified\\\"\"" pod="openstack/cinder-db-sync-mbhfs" podUID="21c84c7a-03f0-4ab5-a259-95a351cbdf13" Mar 19 10:42:48 crc kubenswrapper[4765]: I0319 10:42:48.937108 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-db-sync-q2fr9" podStartSLOduration=4.5280599 podStartE2EDuration="33.937087321s" podCreationTimestamp="2026-03-19 10:42:15 +0000 UTC" firstStartedPulling="2026-03-19 10:42:16.612536844 +0000 UTC m=+1234.961482386" lastFinishedPulling="2026-03-19 10:42:46.021564265 +0000 UTC m=+1264.370509807" 
observedRunningTime="2026-03-19 10:42:48.933562385 +0000 UTC m=+1267.282507927" watchObservedRunningTime="2026-03-19 10:42:48.937087321 +0000 UTC m=+1267.286032853" Mar 19 10:42:49 crc kubenswrapper[4765]: I0319 10:42:49.629009 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 10:42:49 crc kubenswrapper[4765]: I0319 10:42:49.939045 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cczjt" event={"ID":"e4012221-7e3d-4ee8-9c90-e564931f5a30","Type":"ContainerStarted","Data":"594824d6c73c3246a8884eeefdc9303559e3ecafa98a05f9f83de087157e8f7c"} Mar 19 10:42:49 crc kubenswrapper[4765]: I0319 10:42:49.944181 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8dd71512-2453-4dff-98d8-3cf981fbbb8f","Type":"ContainerStarted","Data":"b6ebe68d3a95bc401568b3ba8caa6bf11c12bccbcf18901317915931dd3c8093"} Mar 19 10:42:49 crc kubenswrapper[4765]: I0319 10:42:49.951457 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a083bcfd-87a7-43f7-b0a3-1180bea648b3","Type":"ContainerStarted","Data":"6758a99c044d1524a047506a22b3b0dc8a85c0c3e416de94b3f07f50e8f83a3f"} Mar 19 10:42:49 crc kubenswrapper[4765]: I0319 10:42:49.954639 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j8hqn" event={"ID":"2ec7bc5e-c876-4f51-8135-166f8ea45721","Type":"ContainerStarted","Data":"a752eed62ec4bf77939b90fc56b5260c1b4f2eac5088987b4dba2288b1f30345"} Mar 19 10:42:49 crc kubenswrapper[4765]: I0319 10:42:49.966031 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-db-sync-cczjt" podStartSLOduration=3.79442814 podStartE2EDuration="34.966010117s" podCreationTimestamp="2026-03-19 10:42:15 +0000 UTC" firstStartedPulling="2026-03-19 10:42:17.502041328 +0000 UTC m=+1235.850986870" lastFinishedPulling="2026-03-19 
10:42:48.673623305 +0000 UTC m=+1267.022568847" observedRunningTime="2026-03-19 10:42:49.964120596 +0000 UTC m=+1268.313066138" watchObservedRunningTime="2026-03-19 10:42:49.966010117 +0000 UTC m=+1268.314955659" Mar 19 10:42:49 crc kubenswrapper[4765]: I0319 10:42:49.991540 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-bootstrap-j8hqn" podStartSLOduration=10.991517359 podStartE2EDuration="10.991517359s" podCreationTimestamp="2026-03-19 10:42:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:42:49.98344483 +0000 UTC m=+1268.332390372" watchObservedRunningTime="2026-03-19 10:42:49.991517359 +0000 UTC m=+1268.340462901" Mar 19 10:42:51 crc kubenswrapper[4765]: I0319 10:42:51.002571 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c6bdcb6fb-89kxv" event={"ID":"5112f66b-28fa-4500-b77b-351b8c3d0519","Type":"ContainerStarted","Data":"9f412968986b7556b9d0cd9de4886ebe503ad2f5c2b7c5677168459667cc0902"} Mar 19 10:42:51 crc kubenswrapper[4765]: I0319 10:42:51.004791 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c6ff5646d-fmdz2" event={"ID":"b506e362-44bf-4267-bea0-18131aa011fa","Type":"ContainerStarted","Data":"2c3331c089f8c7782f9a7d8827a9db3214c66b45628fa1bc122e6c8bb08231ad"} Mar 19 10:42:51 crc kubenswrapper[4765]: I0319 10:42:51.006316 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8dd71512-2453-4dff-98d8-3cf981fbbb8f","Type":"ContainerStarted","Data":"ecd8b3390445b80ae72265c157bbdc070db369ac9b9f3af9d421ae1b82ddbdb2"} Mar 19 10:42:51 crc kubenswrapper[4765]: I0319 10:42:51.019553 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" 
event={"ID":"a083bcfd-87a7-43f7-b0a3-1180bea648b3","Type":"ContainerStarted","Data":"a2c13d1a9d7fc306c80cb91654ba453ad5b1dc4a52e34f7c0e29a95aed9cdd2d"} Mar 19 10:42:51 crc kubenswrapper[4765]: I0319 10:42:51.070109 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-5f59b8f679-fwp2t" podUID="c66aa28a-5eba-4c64-abec-75b6577131a4" containerName="dnsmasq-dns" probeResult="failure" output="dial tcp 10.217.0.130:5353: i/o timeout" Mar 19 10:42:51 crc kubenswrapper[4765]: I0319 10:42:51.079464 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=4.079441926 podStartE2EDuration="4.079441926s" podCreationTimestamp="2026-03-19 10:42:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:42:51.056563606 +0000 UTC m=+1269.405509158" watchObservedRunningTime="2026-03-19 10:42:51.079441926 +0000 UTC m=+1269.428387468" Mar 19 10:42:52 crc kubenswrapper[4765]: I0319 10:42:52.043265 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc731af7-c5a0-4d4e-9f33-9deec0f322ee","Type":"ContainerStarted","Data":"4f334742b0e1a014b834b4129a622ca5ef19d7ed5b3555050a9039cf73ec1210"} Mar 19 10:42:52 crc kubenswrapper[4765]: I0319 10:42:52.047279 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-6c6ff5646d-fmdz2" event={"ID":"b506e362-44bf-4267-bea0-18131aa011fa","Type":"ContainerStarted","Data":"94823c36f8045bdd6bf6b4883f73a0acc2b688e13ac1a95d49fafcadcdfa027b"} Mar 19 10:42:52 crc kubenswrapper[4765]: I0319 10:42:52.051736 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a083bcfd-87a7-43f7-b0a3-1180bea648b3","Type":"ContainerStarted","Data":"21835277e1595cab353343771c2d49e7872ba78270008da1c8865836e3c549b7"} Mar 19 10:42:52 crc kubenswrapper[4765]: 
I0319 10:42:52.059284 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c6bdcb6fb-89kxv" event={"ID":"5112f66b-28fa-4500-b77b-351b8c3d0519","Type":"ContainerStarted","Data":"1a7ad5eca76b21850fa11fd220a31f1fb2463a805be4f0068170a81bfea7d086"} Mar 19 10:42:52 crc kubenswrapper[4765]: I0319 10:42:52.078746 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-6c6ff5646d-fmdz2" podStartSLOduration=26.492437226 podStartE2EDuration="28.078724469s" podCreationTimestamp="2026-03-19 10:42:24 +0000 UTC" firstStartedPulling="2026-03-19 10:42:48.384083642 +0000 UTC m=+1266.733029184" lastFinishedPulling="2026-03-19 10:42:49.970370885 +0000 UTC m=+1268.319316427" observedRunningTime="2026-03-19 10:42:52.073301612 +0000 UTC m=+1270.422247154" watchObservedRunningTime="2026-03-19 10:42:52.078724469 +0000 UTC m=+1270.427670011" Mar 19 10:42:52 crc kubenswrapper[4765]: I0319 10:42:52.101082 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/horizon-5c6bdcb6fb-89kxv" podStartSLOduration=26.772965605 podStartE2EDuration="28.101053655s" podCreationTimestamp="2026-03-19 10:42:24 +0000 UTC" firstStartedPulling="2026-03-19 10:42:48.671191639 +0000 UTC m=+1267.020137181" lastFinishedPulling="2026-03-19 10:42:49.999279689 +0000 UTC m=+1268.348225231" observedRunningTime="2026-03-19 10:42:52.094038114 +0000 UTC m=+1270.442983666" watchObservedRunningTime="2026-03-19 10:42:52.101053655 +0000 UTC m=+1270.449999197" Mar 19 10:42:52 crc kubenswrapper[4765]: I0319 10:42:52.156168 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=6.156148289 podStartE2EDuration="6.156148289s" podCreationTimestamp="2026-03-19 10:42:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:42:52.122511836 +0000 UTC 
m=+1270.471457378" watchObservedRunningTime="2026-03-19 10:42:52.156148289 +0000 UTC m=+1270.505093831" Mar 19 10:42:54 crc kubenswrapper[4765]: I0319 10:42:54.091097 4765 generic.go:334] "Generic (PLEG): container finished" podID="2ec7bc5e-c876-4f51-8135-166f8ea45721" containerID="a752eed62ec4bf77939b90fc56b5260c1b4f2eac5088987b4dba2288b1f30345" exitCode=0 Mar 19 10:42:54 crc kubenswrapper[4765]: I0319 10:42:54.091176 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j8hqn" event={"ID":"2ec7bc5e-c876-4f51-8135-166f8ea45721","Type":"ContainerDied","Data":"a752eed62ec4bf77939b90fc56b5260c1b4f2eac5088987b4dba2288b1f30345"} Mar 19 10:42:55 crc kubenswrapper[4765]: I0319 10:42:55.071363 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:55 crc kubenswrapper[4765]: I0319 10:42:55.071859 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:42:55 crc kubenswrapper[4765]: I0319 10:42:55.104749 4765 generic.go:334] "Generic (PLEG): container finished" podID="ed90710f-8437-4621-b01e-a78cb4f0a96c" containerID="7521b214959d7571a4688a00f1f6cdd063309f03e9c54ad589a21fd2be254f9b" exitCode=0 Mar 19 10:42:55 crc kubenswrapper[4765]: I0319 10:42:55.104833 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q2fr9" event={"ID":"ed90710f-8437-4621-b01e-a78cb4f0a96c","Type":"ContainerDied","Data":"7521b214959d7571a4688a00f1f6cdd063309f03e9c54ad589a21fd2be254f9b"} Mar 19 10:42:55 crc kubenswrapper[4765]: I0319 10:42:55.231841 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:55 crc kubenswrapper[4765]: I0319 10:42:55.232086 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:42:55 crc kubenswrapper[4765]: I0319 10:42:55.350893 
4765 scope.go:117] "RemoveContainer" containerID="2a25fb65b9413294faba5a046fe314e6432042e0f75e502bbf896a2043d09937" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.344099 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.394927 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-credential-keys\") pod \"2ec7bc5e-c876-4f51-8135-166f8ea45721\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.395040 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2d82\" (UniqueName: \"kubernetes.io/projected/2ec7bc5e-c876-4f51-8135-166f8ea45721-kube-api-access-s2d82\") pod \"2ec7bc5e-c876-4f51-8135-166f8ea45721\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.395080 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-scripts\") pod \"2ec7bc5e-c876-4f51-8135-166f8ea45721\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.395186 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-config-data\") pod \"2ec7bc5e-c876-4f51-8135-166f8ea45721\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.395219 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-fernet-keys\") pod \"2ec7bc5e-c876-4f51-8135-166f8ea45721\" 
(UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.395251 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-combined-ca-bundle\") pod \"2ec7bc5e-c876-4f51-8135-166f8ea45721\" (UID: \"2ec7bc5e-c876-4f51-8135-166f8ea45721\") " Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.403607 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "2ec7bc5e-c876-4f51-8135-166f8ea45721" (UID: "2ec7bc5e-c876-4f51-8135-166f8ea45721"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.407948 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-scripts" (OuterVolumeSpecName: "scripts") pod "2ec7bc5e-c876-4f51-8135-166f8ea45721" (UID: "2ec7bc5e-c876-4f51-8135-166f8ea45721"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.416362 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ec7bc5e-c876-4f51-8135-166f8ea45721-kube-api-access-s2d82" (OuterVolumeSpecName: "kube-api-access-s2d82") pod "2ec7bc5e-c876-4f51-8135-166f8ea45721" (UID: "2ec7bc5e-c876-4f51-8135-166f8ea45721"). InnerVolumeSpecName "kube-api-access-s2d82". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.427677 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "2ec7bc5e-c876-4f51-8135-166f8ea45721" (UID: "2ec7bc5e-c876-4f51-8135-166f8ea45721"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.428304 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2ec7bc5e-c876-4f51-8135-166f8ea45721" (UID: "2ec7bc5e-c876-4f51-8135-166f8ea45721"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.433139 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-config-data" (OuterVolumeSpecName: "config-data") pod "2ec7bc5e-c876-4f51-8135-166f8ea45721" (UID: "2ec7bc5e-c876-4f51-8135-166f8ea45721"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.497680 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.497927 4765 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.498062 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.498115 4765 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-credential-keys\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.498185 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2d82\" (UniqueName: \"kubernetes.io/projected/2ec7bc5e-c876-4f51-8135-166f8ea45721-kube-api-access-s2d82\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.498238 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2ec7bc5e-c876-4f51-8135-166f8ea45721-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.545082 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.599333 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-config-data\") pod \"ed90710f-8437-4621-b01e-a78cb4f0a96c\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.599628 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed90710f-8437-4621-b01e-a78cb4f0a96c-logs\") pod \"ed90710f-8437-4621-b01e-a78cb4f0a96c\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.599899 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dx7g9\" (UniqueName: \"kubernetes.io/projected/ed90710f-8437-4621-b01e-a78cb4f0a96c-kube-api-access-dx7g9\") pod \"ed90710f-8437-4621-b01e-a78cb4f0a96c\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.600093 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-scripts\") pod \"ed90710f-8437-4621-b01e-a78cb4f0a96c\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.600199 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-combined-ca-bundle\") pod \"ed90710f-8437-4621-b01e-a78cb4f0a96c\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.602590 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/ed90710f-8437-4621-b01e-a78cb4f0a96c-logs" (OuterVolumeSpecName: "logs") pod "ed90710f-8437-4621-b01e-a78cb4f0a96c" (UID: "ed90710f-8437-4621-b01e-a78cb4f0a96c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.604943 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed90710f-8437-4621-b01e-a78cb4f0a96c-kube-api-access-dx7g9" (OuterVolumeSpecName: "kube-api-access-dx7g9") pod "ed90710f-8437-4621-b01e-a78cb4f0a96c" (UID: "ed90710f-8437-4621-b01e-a78cb4f0a96c"). InnerVolumeSpecName "kube-api-access-dx7g9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.609429 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-scripts" (OuterVolumeSpecName: "scripts") pod "ed90710f-8437-4621-b01e-a78cb4f0a96c" (UID: "ed90710f-8437-4621-b01e-a78cb4f0a96c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:56 crc kubenswrapper[4765]: E0319 10:42:56.625694 4765 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-config-data podName:ed90710f-8437-4621-b01e-a78cb4f0a96c nodeName:}" failed. No retries permitted until 2026-03-19 10:42:57.12566406 +0000 UTC m=+1275.474609602 (durationBeforeRetry 500ms). 
Error: error cleaning subPath mounts for volume "config-data" (UniqueName: "kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-config-data") pod "ed90710f-8437-4621-b01e-a78cb4f0a96c" (UID: "ed90710f-8437-4621-b01e-a78cb4f0a96c") : error deleting /var/lib/kubelet/pods/ed90710f-8437-4621-b01e-a78cb4f0a96c/volume-subpaths: remove /var/lib/kubelet/pods/ed90710f-8437-4621-b01e-a78cb4f0a96c/volume-subpaths: no such file or directory Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.629590 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ed90710f-8437-4621-b01e-a78cb4f0a96c" (UID: "ed90710f-8437-4621-b01e-a78cb4f0a96c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.704838 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/ed90710f-8437-4621-b01e-a78cb4f0a96c-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.704885 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dx7g9\" (UniqueName: \"kubernetes.io/projected/ed90710f-8437-4621-b01e-a78cb4f0a96c-kube-api-access-dx7g9\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.704898 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:56 crc kubenswrapper[4765]: I0319 10:42:56.704909 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 
10:42:57.126950 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-db-sync-q2fr9" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.127196 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-db-sync-q2fr9" event={"ID":"ed90710f-8437-4621-b01e-a78cb4f0a96c","Type":"ContainerDied","Data":"c05b9477bcc89ed9ff849704d2af14832e002e4970ae81d969d88cf36882e00c"} Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.127725 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c05b9477bcc89ed9ff849704d2af14832e002e4970ae81d969d88cf36882e00c" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.132056 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-bootstrap-j8hqn" event={"ID":"2ec7bc5e-c876-4f51-8135-166f8ea45721","Type":"ContainerDied","Data":"dfd23f01ff898db2eba6bba25666241bab5ef138e831f907b435b199965a3ca9"} Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.132097 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dfd23f01ff898db2eba6bba25666241bab5ef138e831f907b435b199965a3ca9" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.132163 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-bootstrap-j8hqn" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.214436 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-config-data\") pod \"ed90710f-8437-4621-b01e-a78cb4f0a96c\" (UID: \"ed90710f-8437-4621-b01e-a78cb4f0a96c\") " Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.221076 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-config-data" (OuterVolumeSpecName: "config-data") pod "ed90710f-8437-4621-b01e-a78cb4f0a96c" (UID: "ed90710f-8437-4621-b01e-a78cb4f0a96c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.229062 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-78fbf58cd4-88cvd"] Mar 19 10:42:57 crc kubenswrapper[4765]: E0319 10:42:57.229537 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2ec7bc5e-c876-4f51-8135-166f8ea45721" containerName="keystone-bootstrap" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.229617 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ec7bc5e-c876-4f51-8135-166f8ea45721" containerName="keystone-bootstrap" Mar 19 10:42:57 crc kubenswrapper[4765]: E0319 10:42:57.229698 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed90710f-8437-4621-b01e-a78cb4f0a96c" containerName="placement-db-sync" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.229759 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed90710f-8437-4621-b01e-a78cb4f0a96c" containerName="placement-db-sync" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.230045 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed90710f-8437-4621-b01e-a78cb4f0a96c" 
containerName="placement-db-sync" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.230132 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="2ec7bc5e-c876-4f51-8135-166f8ea45721" containerName="keystone-bootstrap" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.231038 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.238483 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-public-svc" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.238735 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-placement-internal-svc" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.263722 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78fbf58cd4-88cvd"] Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.310313 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.310374 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.315980 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-combined-ca-bundle\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.316039 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-internal-tls-certs\") pod 
\"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.316088 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-public-tls-certs\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.316126 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-config-data\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.316242 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83bf4704-916f-4b97-804c-c64d00158bc5-logs\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.316312 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-scripts\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.316385 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s285x\" (UniqueName: \"kubernetes.io/projected/83bf4704-916f-4b97-804c-c64d00158bc5-kube-api-access-s285x\") pod 
\"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.316463 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ed90710f-8437-4621-b01e-a78cb4f0a96c-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.355376 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.364895 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.418447 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s285x\" (UniqueName: \"kubernetes.io/projected/83bf4704-916f-4b97-804c-c64d00158bc5-kube-api-access-s285x\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.418508 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-combined-ca-bundle\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.418549 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-internal-tls-certs\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.418592 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-public-tls-certs\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.418629 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-config-data\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.418664 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83bf4704-916f-4b97-804c-c64d00158bc5-logs\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.418704 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-scripts\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.419793 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83bf4704-916f-4b97-804c-c64d00158bc5-logs\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.424346 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-scripts\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.424511 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-internal-tls-certs\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.426489 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-config-data\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.427418 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-public-tls-certs\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.427456 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-combined-ca-bundle\") pod \"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.440872 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s285x\" (UniqueName: \"kubernetes.io/projected/83bf4704-916f-4b97-804c-c64d00158bc5-kube-api-access-s285x\") pod 
\"placement-78fbf58cd4-88cvd\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.536884 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-79688b6ffc-lc92w"] Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.538887 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.543182 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.543562 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-scripts" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.543862 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-config-data" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.544565 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-internal-svc" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.544743 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-keystone-public-svc" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.544935 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"keystone-keystone-dockercfg-zvnkb" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.576071 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-79688b6ffc-lc92w"] Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.602438 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.619193 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.619242 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.625807 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-fernet-keys\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.625863 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ww9nv\" (UniqueName: \"kubernetes.io/projected/c59a3da3-7154-4531-9bf8-96771979b410-kube-api-access-ww9nv\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.625888 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-config-data\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.625913 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-credential-keys\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " 
pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.626027 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-public-tls-certs\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.626084 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-internal-tls-certs\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.626117 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-combined-ca-bundle\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.626138 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-scripts\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.664450 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.672292 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/glance-default-external-api-0" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.727870 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-public-tls-certs\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.728014 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-internal-tls-certs\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.728080 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-combined-ca-bundle\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.728106 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-scripts\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.728190 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ww9nv\" (UniqueName: \"kubernetes.io/projected/c59a3da3-7154-4531-9bf8-96771979b410-kube-api-access-ww9nv\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc 
kubenswrapper[4765]: I0319 10:42:57.728212 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-fernet-keys\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.728237 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-config-data\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.728270 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-credential-keys\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.733068 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-internal-tls-certs\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.734082 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-fernet-keys\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.737436 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"credential-keys\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-credential-keys\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.740944 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-scripts\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.741458 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-public-tls-certs\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.748151 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-config-data\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.760239 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ww9nv\" (UniqueName: \"kubernetes.io/projected/c59a3da3-7154-4531-9bf8-96771979b410-kube-api-access-ww9nv\") pod \"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.768098 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/c59a3da3-7154-4531-9bf8-96771979b410-combined-ca-bundle\") pod 
\"keystone-79688b6ffc-lc92w\" (UID: \"c59a3da3-7154-4531-9bf8-96771979b410\") " pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:57 crc kubenswrapper[4765]: I0319 10:42:57.864674 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:42:58 crc kubenswrapper[4765]: I0319 10:42:58.115509 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-78fbf58cd4-88cvd"] Mar 19 10:42:58 crc kubenswrapper[4765]: W0319 10:42:58.120439 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83bf4704_916f_4b97_804c_c64d00158bc5.slice/crio-35d100727d6206ae2b837126fdec2f26519ef810c93fd624c96860bbaf688e7c WatchSource:0}: Error finding container 35d100727d6206ae2b837126fdec2f26519ef810c93fd624c96860bbaf688e7c: Status 404 returned error can't find the container with id 35d100727d6206ae2b837126fdec2f26519ef810c93fd624c96860bbaf688e7c Mar 19 10:42:58 crc kubenswrapper[4765]: I0319 10:42:58.153773 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78fbf58cd4-88cvd" event={"ID":"83bf4704-916f-4b97-804c-c64d00158bc5","Type":"ContainerStarted","Data":"35d100727d6206ae2b837126fdec2f26519ef810c93fd624c96860bbaf688e7c"} Mar 19 10:42:58 crc kubenswrapper[4765]: I0319 10:42:58.155005 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 10:42:58 crc kubenswrapper[4765]: I0319 10:42:58.155117 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 10:42:58 crc kubenswrapper[4765]: I0319 10:42:58.155177 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 10:42:58 crc kubenswrapper[4765]: I0319 10:42:58.155273 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/glance-default-internal-api-0" Mar 19 10:42:58 crc kubenswrapper[4765]: I0319 10:42:58.392283 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-79688b6ffc-lc92w"] Mar 19 10:42:58 crc kubenswrapper[4765]: W0319 10:42:58.399795 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc59a3da3_7154_4531_9bf8_96771979b410.slice/crio-3844b558595cbab4b5a48a0f8f56d2461dc334882cac2433b5330d6edeffb945 WatchSource:0}: Error finding container 3844b558595cbab4b5a48a0f8f56d2461dc334882cac2433b5330d6edeffb945: Status 404 returned error can't find the container with id 3844b558595cbab4b5a48a0f8f56d2461dc334882cac2433b5330d6edeffb945 Mar 19 10:42:59 crc kubenswrapper[4765]: I0319 10:42:59.172425 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79688b6ffc-lc92w" event={"ID":"c59a3da3-7154-4531-9bf8-96771979b410","Type":"ContainerStarted","Data":"3844b558595cbab4b5a48a0f8f56d2461dc334882cac2433b5330d6edeffb945"} Mar 19 10:43:00 crc kubenswrapper[4765]: I0319 10:43:00.184460 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 10:43:00 crc kubenswrapper[4765]: I0319 10:43:00.186365 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 10:43:00 crc kubenswrapper[4765]: I0319 10:43:00.294123 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 10:43:00 crc kubenswrapper[4765]: I0319 10:43:00.294211 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 10:43:00 crc kubenswrapper[4765]: I0319 10:43:00.516245 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 10:43:00 crc kubenswrapper[4765]: I0319 10:43:00.629580 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/glance-default-internal-api-0" Mar 19 10:43:00 crc kubenswrapper[4765]: I0319 10:43:00.649775 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 10:43:00 crc kubenswrapper[4765]: I0319 10:43:00.966733 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/placement-54bc4cb6bd-w8bvw"] Mar 19 10:43:00 crc kubenswrapper[4765]: I0319 10:43:00.968637 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:00 crc kubenswrapper[4765]: I0319 10:43:00.987591 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/placement-54bc4cb6bd-w8bvw"] Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.101767 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-logs\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.101853 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-internal-tls-certs\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.101976 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-combined-ca-bundle\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.102097 
4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-scripts\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.102192 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-public-tls-certs\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.102233 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-config-data\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.102277 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6dn8\" (UniqueName: \"kubernetes.io/projected/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-kube-api-access-x6dn8\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.204692 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-logs\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.204791 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-internal-tls-certs\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.204838 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-combined-ca-bundle\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.204881 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-scripts\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.204929 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-public-tls-certs\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.204952 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-config-data\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.204996 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-x6dn8\" (UniqueName: \"kubernetes.io/projected/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-kube-api-access-x6dn8\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.206467 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-logs\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.212847 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-config-data\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.213465 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-internal-tls-certs\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.214455 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-combined-ca-bundle\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.217869 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-scripts\") pod 
\"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.218462 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-public-tls-certs\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.222491 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6dn8\" (UniqueName: \"kubernetes.io/projected/21f8be56-b9b5-4205-8de4-dd4d204b9f3d-kube-api-access-x6dn8\") pod \"placement-54bc4cb6bd-w8bvw\" (UID: \"21f8be56-b9b5-4205-8de4-dd4d204b9f3d\") " pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.301908 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.656293 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.656933 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.657020 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.659642 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f15a48ff831f92a999a822adb51bf1e4ef1ab9b4cad221adcbd0787b32c65b85"} pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.659732 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://f15a48ff831f92a999a822adb51bf1e4ef1ab9b4cad221adcbd0787b32c65b85" gracePeriod=600 Mar 19 10:43:01 crc kubenswrapper[4765]: I0319 10:43:01.691576 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/placement-54bc4cb6bd-w8bvw"] Mar 19 10:43:02 crc kubenswrapper[4765]: I0319 10:43:02.203628 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54bc4cb6bd-w8bvw" event={"ID":"21f8be56-b9b5-4205-8de4-dd4d204b9f3d","Type":"ContainerStarted","Data":"bb5deb8198f6d0f759d7fe0ba674b9258d924dc3ef1d98ed6e5fe541ff3e47bc"} Mar 19 10:43:02 crc kubenswrapper[4765]: I0319 10:43:02.207022 4765 generic.go:334] "Generic (PLEG): container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerID="f15a48ff831f92a999a822adb51bf1e4ef1ab9b4cad221adcbd0787b32c65b85" exitCode=0 Mar 19 10:43:02 crc kubenswrapper[4765]: I0319 10:43:02.207074 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"f15a48ff831f92a999a822adb51bf1e4ef1ab9b4cad221adcbd0787b32c65b85"} Mar 19 10:43:02 crc kubenswrapper[4765]: I0319 10:43:02.207493 4765 scope.go:117] "RemoveContainer" containerID="74ef4f9e7cb23afbd3cc2c57d6c7b62007d3fc20daf2ec79338ae2ff820f9dfb" Mar 19 10:43:03 crc kubenswrapper[4765]: I0319 10:43:03.227996 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"7fbbabc77237677f702271306a25be40ef78a15b44ac1218092fa412c82ce0c1"} Mar 19 10:43:03 crc kubenswrapper[4765]: I0319 10:43:03.229493 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54bc4cb6bd-w8bvw" event={"ID":"21f8be56-b9b5-4205-8de4-dd4d204b9f3d","Type":"ContainerStarted","Data":"dc12869b9ddb1e584d430d6403bc24d55ad3f3bc685869b405351011afb28121"} Mar 19 10:43:04 crc kubenswrapper[4765]: I0319 10:43:04.249204 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78fbf58cd4-88cvd" 
event={"ID":"83bf4704-916f-4b97-804c-c64d00158bc5","Type":"ContainerStarted","Data":"6c2171aeedc98b01a345cc71afeb8471981c231139f7a7849d287fef6f6b3565"} Mar 19 10:43:04 crc kubenswrapper[4765]: I0319 10:43:04.251343 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-79688b6ffc-lc92w" event={"ID":"c59a3da3-7154-4531-9bf8-96771979b410","Type":"ContainerStarted","Data":"0906d041de5d695dab18208500655f6598af38d342dd505260eb9e6085464b68"} Mar 19 10:43:04 crc kubenswrapper[4765]: I0319 10:43:04.251479 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:43:04 crc kubenswrapper[4765]: I0319 10:43:04.259685 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-54bc4cb6bd-w8bvw" event={"ID":"21f8be56-b9b5-4205-8de4-dd4d204b9f3d","Type":"ContainerStarted","Data":"9e2d12a4f9d868116b36be844b696efe4226371e4f7b4fbf6118d2681396913a"} Mar 19 10:43:04 crc kubenswrapper[4765]: I0319 10:43:04.288307 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-79688b6ffc-lc92w" podStartSLOduration=7.288273105 podStartE2EDuration="7.288273105s" podCreationTimestamp="2026-03-19 10:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:04.270215566 +0000 UTC m=+1282.619161118" watchObservedRunningTime="2026-03-19 10:43:04.288273105 +0000 UTC m=+1282.637218647" Mar 19 10:43:04 crc kubenswrapper[4765]: I0319 10:43:04.317540 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-54bc4cb6bd-w8bvw" podStartSLOduration=4.317504538 podStartE2EDuration="4.317504538s" podCreationTimestamp="2026-03-19 10:43:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:04.307767044 +0000 UTC 
m=+1282.656712586" watchObservedRunningTime="2026-03-19 10:43:04.317504538 +0000 UTC m=+1282.666450080" Mar 19 10:43:05 crc kubenswrapper[4765]: I0319 10:43:05.082542 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-5c6bdcb6fb-89kxv" podUID="5112f66b-28fa-4500-b77b-351b8c3d0519" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 19 10:43:05 crc kubenswrapper[4765]: I0319 10:43:05.238507 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6c6ff5646d-fmdz2" podUID="b506e362-44bf-4267-bea0-18131aa011fa" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Mar 19 10:43:05 crc kubenswrapper[4765]: I0319 10:43:05.270716 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc731af7-c5a0-4d4e-9f33-9deec0f322ee","Type":"ContainerStarted","Data":"c25af2fc24add53e6b2bf0a6e5ecc96f539fac4f29346303ea42f3afd37f9d2c"} Mar 19 10:43:05 crc kubenswrapper[4765]: I0319 10:43:05.274185 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mbhfs" event={"ID":"21c84c7a-03f0-4ab5-a259-95a351cbdf13","Type":"ContainerStarted","Data":"eb368f1410dc53f6a7bcd7b26fa29ac3838dee4fc9d92e44e8e090d0b483cfce"} Mar 19 10:43:05 crc kubenswrapper[4765]: I0319 10:43:05.277212 4765 generic.go:334] "Generic (PLEG): container finished" podID="e4012221-7e3d-4ee8-9c90-e564931f5a30" containerID="594824d6c73c3246a8884eeefdc9303559e3ecafa98a05f9f83de087157e8f7c" exitCode=0 Mar 19 10:43:05 crc kubenswrapper[4765]: I0319 10:43:05.277283 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cczjt" 
event={"ID":"e4012221-7e3d-4ee8-9c90-e564931f5a30","Type":"ContainerDied","Data":"594824d6c73c3246a8884eeefdc9303559e3ecafa98a05f9f83de087157e8f7c"} Mar 19 10:43:05 crc kubenswrapper[4765]: I0319 10:43:05.280071 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78fbf58cd4-88cvd" event={"ID":"83bf4704-916f-4b97-804c-c64d00158bc5","Type":"ContainerStarted","Data":"009916405b79f186f13e7105fc31b9e99265153221baab7a5bbf67e6ce9b6cd6"} Mar 19 10:43:05 crc kubenswrapper[4765]: I0319 10:43:05.280342 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:43:05 crc kubenswrapper[4765]: I0319 10:43:05.280545 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:43:05 crc kubenswrapper[4765]: I0319 10:43:05.280566 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:05 crc kubenswrapper[4765]: I0319 10:43:05.280578 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:05 crc kubenswrapper[4765]: I0319 10:43:05.303105 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-db-sync-mbhfs" podStartSLOduration=2.463462033 podStartE2EDuration="50.303087809s" podCreationTimestamp="2026-03-19 10:42:15 +0000 UTC" firstStartedPulling="2026-03-19 10:42:16.640267576 +0000 UTC m=+1234.989213118" lastFinishedPulling="2026-03-19 10:43:04.479893352 +0000 UTC m=+1282.828838894" observedRunningTime="2026-03-19 10:43:05.301218139 +0000 UTC m=+1283.650163681" watchObservedRunningTime="2026-03-19 10:43:05.303087809 +0000 UTC m=+1283.652033351" Mar 19 10:43:05 crc kubenswrapper[4765]: I0319 10:43:05.322263 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/placement-78fbf58cd4-88cvd" podStartSLOduration=8.322242579 
podStartE2EDuration="8.322242579s" podCreationTimestamp="2026-03-19 10:42:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:05.322096455 +0000 UTC m=+1283.671042017" watchObservedRunningTime="2026-03-19 10:43:05.322242579 +0000 UTC m=+1283.671188121" Mar 19 10:43:06 crc kubenswrapper[4765]: I0319 10:43:06.692165 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cczjt" Mar 19 10:43:06 crc kubenswrapper[4765]: I0319 10:43:06.845786 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wds8\" (UniqueName: \"kubernetes.io/projected/e4012221-7e3d-4ee8-9c90-e564931f5a30-kube-api-access-7wds8\") pod \"e4012221-7e3d-4ee8-9c90-e564931f5a30\" (UID: \"e4012221-7e3d-4ee8-9c90-e564931f5a30\") " Mar 19 10:43:06 crc kubenswrapper[4765]: I0319 10:43:06.845911 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4012221-7e3d-4ee8-9c90-e564931f5a30-combined-ca-bundle\") pod \"e4012221-7e3d-4ee8-9c90-e564931f5a30\" (UID: \"e4012221-7e3d-4ee8-9c90-e564931f5a30\") " Mar 19 10:43:06 crc kubenswrapper[4765]: I0319 10:43:06.846127 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4012221-7e3d-4ee8-9c90-e564931f5a30-db-sync-config-data\") pod \"e4012221-7e3d-4ee8-9c90-e564931f5a30\" (UID: \"e4012221-7e3d-4ee8-9c90-e564931f5a30\") " Mar 19 10:43:06 crc kubenswrapper[4765]: I0319 10:43:06.875360 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4012221-7e3d-4ee8-9c90-e564931f5a30-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "e4012221-7e3d-4ee8-9c90-e564931f5a30" (UID: "e4012221-7e3d-4ee8-9c90-e564931f5a30"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:06 crc kubenswrapper[4765]: I0319 10:43:06.876425 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4012221-7e3d-4ee8-9c90-e564931f5a30-kube-api-access-7wds8" (OuterVolumeSpecName: "kube-api-access-7wds8") pod "e4012221-7e3d-4ee8-9c90-e564931f5a30" (UID: "e4012221-7e3d-4ee8-9c90-e564931f5a30"). InnerVolumeSpecName "kube-api-access-7wds8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:06 crc kubenswrapper[4765]: I0319 10:43:06.882713 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e4012221-7e3d-4ee8-9c90-e564931f5a30-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e4012221-7e3d-4ee8-9c90-e564931f5a30" (UID: "e4012221-7e3d-4ee8-9c90-e564931f5a30"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:06 crc kubenswrapper[4765]: I0319 10:43:06.948825 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e4012221-7e3d-4ee8-9c90-e564931f5a30-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:06 crc kubenswrapper[4765]: I0319 10:43:06.948866 4765 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/e4012221-7e3d-4ee8-9c90-e564931f5a30-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:06 crc kubenswrapper[4765]: I0319 10:43:06.948876 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wds8\" (UniqueName: \"kubernetes.io/projected/e4012221-7e3d-4ee8-9c90-e564931f5a30-kube-api-access-7wds8\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.297876 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-db-sync-cczjt" 
event={"ID":"e4012221-7e3d-4ee8-9c90-e564931f5a30","Type":"ContainerDied","Data":"649c5d1dbced6925beea6beac3206676ae8b823fcfe7ca4f4fefd392137720f1"} Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.298087 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="649c5d1dbced6925beea6beac3206676ae8b823fcfe7ca4f4fefd392137720f1" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.297911 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-db-sync-cczjt" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.299473 4765 generic.go:334] "Generic (PLEG): container finished" podID="fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e" containerID="a0e7bd5f7d8c30648a07ded5e04182fb4b1b36a99584c14b6806d67eba09527b" exitCode=0 Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.299505 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jjjh8" event={"ID":"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e","Type":"ContainerDied","Data":"a0e7bd5f7d8c30648a07ded5e04182fb4b1b36a99584c14b6806d67eba09527b"} Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.626599 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-worker-7f975545dc-7gv92"] Mar 19 10:43:07 crc kubenswrapper[4765]: E0319 10:43:07.632366 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4012221-7e3d-4ee8-9c90-e564931f5a30" containerName="barbican-db-sync" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.632407 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4012221-7e3d-4ee8-9c90-e564931f5a30" containerName="barbican-db-sync" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.632783 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4012221-7e3d-4ee8-9c90-e564931f5a30" containerName="barbican-db-sync" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.633725 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.643057 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-barbican-dockercfg-f4tjw" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.644037 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-worker-config-data" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.644200 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-config-data" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.675747 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7f975545dc-7gv92"] Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.720915 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-keystone-listener-6955bd84cd-t7qkv"] Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.722888 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.732097 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-keystone-listener-config-data" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.738198 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6955bd84cd-t7qkv"] Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.762212 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18f0688-cbc1-49ae-a721-b964e45cc1ea-config-data\") pod \"barbican-worker-7f975545dc-7gv92\" (UID: \"b18f0688-cbc1-49ae-a721-b964e45cc1ea\") " pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.762333 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b18f0688-cbc1-49ae-a721-b964e45cc1ea-config-data-custom\") pod \"barbican-worker-7f975545dc-7gv92\" (UID: \"b18f0688-cbc1-49ae-a721-b964e45cc1ea\") " pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.762414 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b18f0688-cbc1-49ae-a721-b964e45cc1ea-logs\") pod \"barbican-worker-7f975545dc-7gv92\" (UID: \"b18f0688-cbc1-49ae-a721-b964e45cc1ea\") " pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.762433 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18f0688-cbc1-49ae-a721-b964e45cc1ea-combined-ca-bundle\") pod \"barbican-worker-7f975545dc-7gv92\" (UID: 
\"b18f0688-cbc1-49ae-a721-b964e45cc1ea\") " pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.762461 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rtqw\" (UniqueName: \"kubernetes.io/projected/b18f0688-cbc1-49ae-a721-b964e45cc1ea-kube-api-access-9rtqw\") pod \"barbican-worker-7f975545dc-7gv92\" (UID: \"b18f0688-cbc1-49ae-a721-b964e45cc1ea\") " pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.762545 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-sxpx9"] Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.768085 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.775843 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-sxpx9"] Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.856667 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-74c69ffcbd-fntb4"] Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.860289 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.865523 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-ovsdbserver-sb\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.865585 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d1a29f-39b3-40d7-9db2-246fc05348cc-combined-ca-bundle\") pod \"barbican-keystone-listener-6955bd84cd-t7qkv\" (UID: \"65d1a29f-39b3-40d7-9db2-246fc05348cc\") " pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.865633 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-dns-swift-storage-0\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.865677 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b18f0688-cbc1-49ae-a721-b964e45cc1ea-logs\") pod \"barbican-worker-7f975545dc-7gv92\" (UID: \"b18f0688-cbc1-49ae-a721-b964e45cc1ea\") " pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.865703 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18f0688-cbc1-49ae-a721-b964e45cc1ea-combined-ca-bundle\") pod 
\"barbican-worker-7f975545dc-7gv92\" (UID: \"b18f0688-cbc1-49ae-a721-b964e45cc1ea\") " pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.865741 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rtqw\" (UniqueName: \"kubernetes.io/projected/b18f0688-cbc1-49ae-a721-b964e45cc1ea-kube-api-access-9rtqw\") pod \"barbican-worker-7f975545dc-7gv92\" (UID: \"b18f0688-cbc1-49ae-a721-b964e45cc1ea\") " pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.865798 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d1a29f-39b3-40d7-9db2-246fc05348cc-config-data\") pod \"barbican-keystone-listener-6955bd84cd-t7qkv\" (UID: \"65d1a29f-39b3-40d7-9db2-246fc05348cc\") " pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.865840 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b18f0688-cbc1-49ae-a721-b964e45cc1ea-config-data\") pod \"barbican-worker-7f975545dc-7gv92\" (UID: \"b18f0688-cbc1-49ae-a721-b964e45cc1ea\") " pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.865871 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbvwn\" (UniqueName: \"kubernetes.io/projected/414bf196-06ed-4a99-99b0-493cac763e5e-kube-api-access-mbvwn\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.866438 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-dns-svc\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.866508 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mjprv\" (UniqueName: \"kubernetes.io/projected/65d1a29f-39b3-40d7-9db2-246fc05348cc-kube-api-access-mjprv\") pod \"barbican-keystone-listener-6955bd84cd-t7qkv\" (UID: \"65d1a29f-39b3-40d7-9db2-246fc05348cc\") " pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.866676 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"barbican-api-config-data" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.867913 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b18f0688-cbc1-49ae-a721-b964e45cc1ea-logs\") pod \"barbican-worker-7f975545dc-7gv92\" (UID: \"b18f0688-cbc1-49ae-a721-b964e45cc1ea\") " pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.869366 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-ovsdbserver-nb\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.869476 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-config\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 
19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.869514 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65d1a29f-39b3-40d7-9db2-246fc05348cc-config-data-custom\") pod \"barbican-keystone-listener-6955bd84cd-t7qkv\" (UID: \"65d1a29f-39b3-40d7-9db2-246fc05348cc\") " pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.869541 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b18f0688-cbc1-49ae-a721-b964e45cc1ea-config-data-custom\") pod \"barbican-worker-7f975545dc-7gv92\" (UID: \"b18f0688-cbc1-49ae-a721-b964e45cc1ea\") " pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.869566 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65d1a29f-39b3-40d7-9db2-246fc05348cc-logs\") pod \"barbican-keystone-listener-6955bd84cd-t7qkv\" (UID: \"65d1a29f-39b3-40d7-9db2-246fc05348cc\") " pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.874473 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-74c69ffcbd-fntb4"] Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.876524 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b18f0688-cbc1-49ae-a721-b964e45cc1ea-combined-ca-bundle\") pod \"barbican-worker-7f975545dc-7gv92\" (UID: \"b18f0688-cbc1-49ae-a721-b964e45cc1ea\") " pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.880455 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/b18f0688-cbc1-49ae-a721-b964e45cc1ea-config-data\") pod \"barbican-worker-7f975545dc-7gv92\" (UID: \"b18f0688-cbc1-49ae-a721-b964e45cc1ea\") " pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.896997 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rtqw\" (UniqueName: \"kubernetes.io/projected/b18f0688-cbc1-49ae-a721-b964e45cc1ea-kube-api-access-9rtqw\") pod \"barbican-worker-7f975545dc-7gv92\" (UID: \"b18f0688-cbc1-49ae-a721-b964e45cc1ea\") " pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.904219 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/b18f0688-cbc1-49ae-a721-b964e45cc1ea-config-data-custom\") pod \"barbican-worker-7f975545dc-7gv92\" (UID: \"b18f0688-cbc1-49ae-a721-b964e45cc1ea\") " pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.972454 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-ovsdbserver-nb\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.972560 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-config\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.972603 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkg5z\" (UniqueName: 
\"kubernetes.io/projected/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-kube-api-access-dkg5z\") pod \"barbican-api-74c69ffcbd-fntb4\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.972644 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65d1a29f-39b3-40d7-9db2-246fc05348cc-config-data-custom\") pod \"barbican-keystone-listener-6955bd84cd-t7qkv\" (UID: \"65d1a29f-39b3-40d7-9db2-246fc05348cc\") " pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.972676 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65d1a29f-39b3-40d7-9db2-246fc05348cc-logs\") pod \"barbican-keystone-listener-6955bd84cd-t7qkv\" (UID: \"65d1a29f-39b3-40d7-9db2-246fc05348cc\") " pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.972707 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-ovsdbserver-sb\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.973625 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65d1a29f-39b3-40d7-9db2-246fc05348cc-logs\") pod \"barbican-keystone-listener-6955bd84cd-t7qkv\" (UID: \"65d1a29f-39b3-40d7-9db2-246fc05348cc\") " pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.973672 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: 
\"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-ovsdbserver-nb\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.973932 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-config\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.974139 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-worker-7f975545dc-7gv92" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.974690 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-ovsdbserver-sb\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.973189 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-combined-ca-bundle\") pod \"barbican-api-74c69ffcbd-fntb4\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.975881 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d1a29f-39b3-40d7-9db2-246fc05348cc-combined-ca-bundle\") pod \"barbican-keystone-listener-6955bd84cd-t7qkv\" (UID: \"65d1a29f-39b3-40d7-9db2-246fc05348cc\") " pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 
10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.976088 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-dns-swift-storage-0\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.976172 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-config-data-custom\") pod \"barbican-api-74c69ffcbd-fntb4\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.976387 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d1a29f-39b3-40d7-9db2-246fc05348cc-config-data\") pod \"barbican-keystone-listener-6955bd84cd-t7qkv\" (UID: \"65d1a29f-39b3-40d7-9db2-246fc05348cc\") " pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.976491 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-logs\") pod \"barbican-api-74c69ffcbd-fntb4\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.976573 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbvwn\" (UniqueName: \"kubernetes.io/projected/414bf196-06ed-4a99-99b0-493cac763e5e-kube-api-access-mbvwn\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " 
pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.976649 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-dns-svc\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.976714 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mjprv\" (UniqueName: \"kubernetes.io/projected/65d1a29f-39b3-40d7-9db2-246fc05348cc-kube-api-access-mjprv\") pod \"barbican-keystone-listener-6955bd84cd-t7qkv\" (UID: \"65d1a29f-39b3-40d7-9db2-246fc05348cc\") " pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.976747 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-config-data\") pod \"barbican-api-74c69ffcbd-fntb4\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.977486 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/65d1a29f-39b3-40d7-9db2-246fc05348cc-config-data-custom\") pod \"barbican-keystone-listener-6955bd84cd-t7qkv\" (UID: \"65d1a29f-39b3-40d7-9db2-246fc05348cc\") " pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.978563 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-dns-svc\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " 
pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.980375 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/65d1a29f-39b3-40d7-9db2-246fc05348cc-combined-ca-bundle\") pod \"barbican-keystone-listener-6955bd84cd-t7qkv\" (UID: \"65d1a29f-39b3-40d7-9db2-246fc05348cc\") " pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.980612 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-dns-swift-storage-0\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:07 crc kubenswrapper[4765]: I0319 10:43:07.983014 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65d1a29f-39b3-40d7-9db2-246fc05348cc-config-data\") pod \"barbican-keystone-listener-6955bd84cd-t7qkv\" (UID: \"65d1a29f-39b3-40d7-9db2-246fc05348cc\") " pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.005128 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mjprv\" (UniqueName: \"kubernetes.io/projected/65d1a29f-39b3-40d7-9db2-246fc05348cc-kube-api-access-mjprv\") pod \"barbican-keystone-listener-6955bd84cd-t7qkv\" (UID: \"65d1a29f-39b3-40d7-9db2-246fc05348cc\") " pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.005386 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbvwn\" (UniqueName: \"kubernetes.io/projected/414bf196-06ed-4a99-99b0-493cac763e5e-kube-api-access-mbvwn\") pod \"dnsmasq-dns-7c67bffd47-sxpx9\" (UID: 
\"414bf196-06ed-4a99-99b0-493cac763e5e\") " pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.058447 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.078333 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkg5z\" (UniqueName: \"kubernetes.io/projected/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-kube-api-access-dkg5z\") pod \"barbican-api-74c69ffcbd-fntb4\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.078410 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-combined-ca-bundle\") pod \"barbican-api-74c69ffcbd-fntb4\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.078471 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-config-data-custom\") pod \"barbican-api-74c69ffcbd-fntb4\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.078555 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-logs\") pod \"barbican-api-74c69ffcbd-fntb4\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.078611 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-config-data\") pod \"barbican-api-74c69ffcbd-fntb4\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.079366 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-logs\") pod \"barbican-api-74c69ffcbd-fntb4\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.088795 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-combined-ca-bundle\") pod \"barbican-api-74c69ffcbd-fntb4\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.090616 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-config-data\") pod \"barbican-api-74c69ffcbd-fntb4\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.096752 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-config-data-custom\") pod \"barbican-api-74c69ffcbd-fntb4\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.098528 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkg5z\" (UniqueName: 
\"kubernetes.io/projected/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-kube-api-access-dkg5z\") pod \"barbican-api-74c69ffcbd-fntb4\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.105953 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.131541 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.560996 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-worker-7f975545dc-7gv92"] Mar 19 10:43:08 crc kubenswrapper[4765]: W0319 10:43:08.569909 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb18f0688_cbc1_49ae_a721_b964e45cc1ea.slice/crio-39bcdbbf75ccbec407d67affa02a7667cfc259cc67db8e9eef4e7ef46fe15ab2 WatchSource:0}: Error finding container 39bcdbbf75ccbec407d67affa02a7667cfc259cc67db8e9eef4e7ef46fe15ab2: Status 404 returned error can't find the container with id 39bcdbbf75ccbec407d67affa02a7667cfc259cc67db8e9eef4e7ef46fe15ab2 Mar 19 10:43:08 crc kubenswrapper[4765]: I0319 10:43:08.843707 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-db-sync-jjjh8" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.000999 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-sxpx9"] Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.001277 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-combined-ca-bundle\") pod \"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e\" (UID: \"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e\") " Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.001426 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fsdwl\" (UniqueName: \"kubernetes.io/projected/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-kube-api-access-fsdwl\") pod \"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e\" (UID: \"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e\") " Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.001529 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-config\") pod \"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e\" (UID: \"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e\") " Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.018594 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-74c69ffcbd-fntb4"] Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.027657 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-kube-api-access-fsdwl" (OuterVolumeSpecName: "kube-api-access-fsdwl") pod "fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e" (UID: "fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e"). InnerVolumeSpecName "kube-api-access-fsdwl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.030766 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-keystone-listener-6955bd84cd-t7qkv"] Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.041101 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-config" (OuterVolumeSpecName: "config") pod "fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e" (UID: "fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.047120 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e" (UID: "fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.104251 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.104293 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fsdwl\" (UniqueName: \"kubernetes.io/projected/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-kube-api-access-fsdwl\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.104307 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.352705 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-db-sync-jjjh8" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.354055 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-db-sync-jjjh8" event={"ID":"fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e","Type":"ContainerDied","Data":"2b03b80b028fc4c98f9bf244f0d1b2cd85e26dede99cb3387a758fab567c0a2f"} Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.354100 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b03b80b028fc4c98f9bf244f0d1b2cd85e26dede99cb3387a758fab567c0a2f" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.357490 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7f975545dc-7gv92" event={"ID":"b18f0688-cbc1-49ae-a721-b964e45cc1ea","Type":"ContainerStarted","Data":"39bcdbbf75ccbec407d67affa02a7667cfc259cc67db8e9eef4e7ef46fe15ab2"} Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.359662 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" event={"ID":"414bf196-06ed-4a99-99b0-493cac763e5e","Type":"ContainerStarted","Data":"c4cad0119cdeee16d622032a335bea3e96eab74f7345d5c781d9b4b953c6bd6e"} Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.359716 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" event={"ID":"414bf196-06ed-4a99-99b0-493cac763e5e","Type":"ContainerStarted","Data":"04a5db578582039fed258a17ee0c62c00dbcb7122d401e697e5f4d9f51cef0c8"} Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.383341 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" event={"ID":"65d1a29f-39b3-40d7-9db2-246fc05348cc","Type":"ContainerStarted","Data":"3d2685b0042bd139b6fe6fdd38259e5aeb1228352a20d7872828b0cef9983989"} Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.387886 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74c69ffcbd-fntb4" event={"ID":"bcf43de1-bffe-4810-bfb0-c6ff2c59020a","Type":"ContainerStarted","Data":"70759ac3af0142b361b7b7eb3782cae1528baa4e99b09eb8e979837547f4d818"} Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.387973 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74c69ffcbd-fntb4" event={"ID":"bcf43de1-bffe-4810-bfb0-c6ff2c59020a","Type":"ContainerStarted","Data":"f9fb5e8cf7da43f95d51a64857313918e6f0174334346da149cffff79fec459d"} Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.560838 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-sxpx9"] Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.632392 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-48d9j"] Mar 19 10:43:09 crc kubenswrapper[4765]: E0319 10:43:09.633115 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e" 
containerName="neutron-db-sync" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.633157 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e" containerName="neutron-db-sync" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.633474 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e" containerName="neutron-db-sync" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.635352 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.656158 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-48d9j"] Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.729925 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-config\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.730336 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.730375 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc 
kubenswrapper[4765]: I0319 10:43:09.730516 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.730550 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r5b4r\" (UniqueName: \"kubernetes.io/projected/daa667b9-5f24-4ae0-8278-1585b136fc1d-kube-api-access-r5b4r\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.730623 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.759974 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-66fb7cb9f6-g7xpk"] Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.762208 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.768427 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-httpd-config" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.768757 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-ovndbs" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.768939 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-neutron-dockercfg-p7tmf" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.769120 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-config" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.778376 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-66fb7cb9f6-g7xpk"] Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.833749 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-combined-ca-bundle\") pod \"neutron-66fb7cb9f6-g7xpk\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.833819 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-ovndb-tls-certs\") pod \"neutron-66fb7cb9f6-g7xpk\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.833894 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: 
\"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.833922 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r5b4r\" (UniqueName: \"kubernetes.io/projected/daa667b9-5f24-4ae0-8278-1585b136fc1d-kube-api-access-r5b4r\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.833946 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-httpd-config\") pod \"neutron-66fb7cb9f6-g7xpk\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.834010 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2z27k\" (UniqueName: \"kubernetes.io/projected/ab7915d2-c641-481f-a9f6-1ce1209c7e17-kube-api-access-2z27k\") pod \"neutron-66fb7cb9f6-g7xpk\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.834046 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.834098 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-config\") pod \"neutron-66fb7cb9f6-g7xpk\" (UID: 
\"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.834125 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.834142 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-config\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.834161 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.835213 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-dns-svc\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.835247 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-ovsdbserver-sb\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc 
kubenswrapper[4765]: I0319 10:43:09.835733 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-dns-swift-storage-0\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.838674 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-ovsdbserver-nb\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.845823 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-config\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.864302 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r5b4r\" (UniqueName: \"kubernetes.io/projected/daa667b9-5f24-4ae0-8278-1585b136fc1d-kube-api-access-r5b4r\") pod \"dnsmasq-dns-848cf88cfc-48d9j\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.938907 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-httpd-config\") pod \"neutron-66fb7cb9f6-g7xpk\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.939010 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-2z27k\" (UniqueName: \"kubernetes.io/projected/ab7915d2-c641-481f-a9f6-1ce1209c7e17-kube-api-access-2z27k\") pod \"neutron-66fb7cb9f6-g7xpk\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.939073 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-config\") pod \"neutron-66fb7cb9f6-g7xpk\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.939135 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-combined-ca-bundle\") pod \"neutron-66fb7cb9f6-g7xpk\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.939160 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-ovndb-tls-certs\") pod \"neutron-66fb7cb9f6-g7xpk\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.947068 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-ovndb-tls-certs\") pod \"neutron-66fb7cb9f6-g7xpk\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.948720 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-config\") pod \"neutron-66fb7cb9f6-g7xpk\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.949465 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-httpd-config\") pod \"neutron-66fb7cb9f6-g7xpk\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.959229 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-combined-ca-bundle\") pod \"neutron-66fb7cb9f6-g7xpk\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.969856 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2z27k\" (UniqueName: \"kubernetes.io/projected/ab7915d2-c641-481f-a9f6-1ce1209c7e17-kube-api-access-2z27k\") pod \"neutron-66fb7cb9f6-g7xpk\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:09 crc kubenswrapper[4765]: I0319 10:43:09.989586 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:10 crc kubenswrapper[4765]: I0319 10:43:10.118334 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:10 crc kubenswrapper[4765]: I0319 10:43:10.400004 4765 generic.go:334] "Generic (PLEG): container finished" podID="414bf196-06ed-4a99-99b0-493cac763e5e" containerID="c4cad0119cdeee16d622032a335bea3e96eab74f7345d5c781d9b4b953c6bd6e" exitCode=0 Mar 19 10:43:10 crc kubenswrapper[4765]: I0319 10:43:10.400144 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" event={"ID":"414bf196-06ed-4a99-99b0-493cac763e5e","Type":"ContainerDied","Data":"c4cad0119cdeee16d622032a335bea3e96eab74f7345d5c781d9b4b953c6bd6e"} Mar 19 10:43:10 crc kubenswrapper[4765]: I0319 10:43:10.403475 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74c69ffcbd-fntb4" event={"ID":"bcf43de1-bffe-4810-bfb0-c6ff2c59020a","Type":"ContainerStarted","Data":"702e682067397a234f29ad5a259464a4313162edb94817bf35554432126fcc98"} Mar 19 10:43:10 crc kubenswrapper[4765]: I0319 10:43:10.403593 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:10 crc kubenswrapper[4765]: I0319 10:43:10.403617 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:10 crc kubenswrapper[4765]: I0319 10:43:10.451734 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-74c69ffcbd-fntb4" podStartSLOduration=3.448831283 podStartE2EDuration="3.448831283s" podCreationTimestamp="2026-03-19 10:43:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:10.447499147 +0000 UTC m=+1288.796444689" watchObservedRunningTime="2026-03-19 10:43:10.448831283 +0000 UTC m=+1288.797776825" Mar 19 10:43:11 crc kubenswrapper[4765]: I0319 10:43:11.336334 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack/neutron-66fb7cb9f6-g7xpk"] Mar 19 10:43:11 crc kubenswrapper[4765]: I0319 10:43:11.348134 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-48d9j"] Mar 19 10:43:11 crc kubenswrapper[4765]: I0319 10:43:11.417557 4765 generic.go:334] "Generic (PLEG): container finished" podID="21c84c7a-03f0-4ab5-a259-95a351cbdf13" containerID="eb368f1410dc53f6a7bcd7b26fa29ac3838dee4fc9d92e44e8e090d0b483cfce" exitCode=0 Mar 19 10:43:11 crc kubenswrapper[4765]: I0319 10:43:11.417631 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mbhfs" event={"ID":"21c84c7a-03f0-4ab5-a259-95a351cbdf13","Type":"ContainerDied","Data":"eb368f1410dc53f6a7bcd7b26fa29ac3838dee4fc9d92e44e8e090d0b483cfce"} Mar 19 10:43:11 crc kubenswrapper[4765]: I0319 10:43:11.423402 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7f975545dc-7gv92" event={"ID":"b18f0688-cbc1-49ae-a721-b964e45cc1ea","Type":"ContainerStarted","Data":"43ce82b375e22597ff0fd5828e2f34ed54c0e67b8236d56896feedb4c41755d7"} Mar 19 10:43:11 crc kubenswrapper[4765]: I0319 10:43:11.428291 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" podUID="414bf196-06ed-4a99-99b0-493cac763e5e" containerName="dnsmasq-dns" containerID="cri-o://29e8cab27c25734a862edff8e07ba72fa6929ae65638db09d8e422c4763162e8" gracePeriod=10 Mar 19 10:43:11 crc kubenswrapper[4765]: I0319 10:43:11.428381 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" event={"ID":"414bf196-06ed-4a99-99b0-493cac763e5e","Type":"ContainerStarted","Data":"29e8cab27c25734a862edff8e07ba72fa6929ae65638db09d8e422c4763162e8"} Mar 19 10:43:11 crc kubenswrapper[4765]: I0319 10:43:11.428424 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:11 crc kubenswrapper[4765]: I0319 
10:43:11.484866 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" podStartSLOduration=4.484846551 podStartE2EDuration="4.484846551s" podCreationTimestamp="2026-03-19 10:43:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:11.454333123 +0000 UTC m=+1289.803278675" watchObservedRunningTime="2026-03-19 10:43:11.484846551 +0000 UTC m=+1289.833792093" Mar 19 10:43:11 crc kubenswrapper[4765]: W0319 10:43:11.625696 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab7915d2_c641_481f_a9f6_1ce1209c7e17.slice/crio-d9a30db13c825fc0be017e8fb75f92508819e713cf7d8ca09ec58af79d3b329b WatchSource:0}: Error finding container d9a30db13c825fc0be017e8fb75f92508819e713cf7d8ca09ec58af79d3b329b: Status 404 returned error can't find the container with id d9a30db13c825fc0be017e8fb75f92508819e713cf7d8ca09ec58af79d3b329b Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.181682 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.306266 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-ovsdbserver-sb\") pod \"414bf196-06ed-4a99-99b0-493cac763e5e\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.306337 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-dns-swift-storage-0\") pod \"414bf196-06ed-4a99-99b0-493cac763e5e\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.306452 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-config\") pod \"414bf196-06ed-4a99-99b0-493cac763e5e\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.306601 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-dns-svc\") pod \"414bf196-06ed-4a99-99b0-493cac763e5e\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.306703 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-ovsdbserver-nb\") pod \"414bf196-06ed-4a99-99b0-493cac763e5e\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.306729 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mbvwn\" 
(UniqueName: \"kubernetes.io/projected/414bf196-06ed-4a99-99b0-493cac763e5e-kube-api-access-mbvwn\") pod \"414bf196-06ed-4a99-99b0-493cac763e5e\" (UID: \"414bf196-06ed-4a99-99b0-493cac763e5e\") " Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.336746 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/414bf196-06ed-4a99-99b0-493cac763e5e-kube-api-access-mbvwn" (OuterVolumeSpecName: "kube-api-access-mbvwn") pod "414bf196-06ed-4a99-99b0-493cac763e5e" (UID: "414bf196-06ed-4a99-99b0-493cac763e5e"). InnerVolumeSpecName "kube-api-access-mbvwn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.411214 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mbvwn\" (UniqueName: \"kubernetes.io/projected/414bf196-06ed-4a99-99b0-493cac763e5e-kube-api-access-mbvwn\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.457270 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-config" (OuterVolumeSpecName: "config") pod "414bf196-06ed-4a99-99b0-493cac763e5e" (UID: "414bf196-06ed-4a99-99b0-493cac763e5e"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.475174 4765 generic.go:334] "Generic (PLEG): container finished" podID="414bf196-06ed-4a99-99b0-493cac763e5e" containerID="29e8cab27c25734a862edff8e07ba72fa6929ae65638db09d8e422c4763162e8" exitCode=0 Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.475240 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" event={"ID":"414bf196-06ed-4a99-99b0-493cac763e5e","Type":"ContainerDied","Data":"29e8cab27c25734a862edff8e07ba72fa6929ae65638db09d8e422c4763162e8"} Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.475267 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" event={"ID":"414bf196-06ed-4a99-99b0-493cac763e5e","Type":"ContainerDied","Data":"04a5db578582039fed258a17ee0c62c00dbcb7122d401e697e5f4d9f51cef0c8"} Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.475381 4765 scope.go:117] "RemoveContainer" containerID="29e8cab27c25734a862edff8e07ba72fa6929ae65638db09d8e422c4763162e8" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.475493 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-7c67bffd47-sxpx9" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.477541 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "414bf196-06ed-4a99-99b0-493cac763e5e" (UID: "414bf196-06ed-4a99-99b0-493cac763e5e"). InnerVolumeSpecName "ovsdbserver-sb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.480319 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" event={"ID":"65d1a29f-39b3-40d7-9db2-246fc05348cc","Type":"ContainerStarted","Data":"456bfdb8ab1becd048dc2e505b764efe2d9748f2d603f5f069ceaa2d362df839"} Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.481950 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "414bf196-06ed-4a99-99b0-493cac763e5e" (UID: "414bf196-06ed-4a99-99b0-493cac763e5e"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.487617 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "414bf196-06ed-4a99-99b0-493cac763e5e" (UID: "414bf196-06ed-4a99-99b0-493cac763e5e"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.489988 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "414bf196-06ed-4a99-99b0-493cac763e5e" (UID: "414bf196-06ed-4a99-99b0-493cac763e5e"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.491469 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66fb7cb9f6-g7xpk" event={"ID":"ab7915d2-c641-481f-a9f6-1ce1209c7e17","Type":"ContainerStarted","Data":"49b4eaf7fb3307781adb143ca4b6178181d60e6e330bab6ca84d5ec1f928af9b"} Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.491508 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66fb7cb9f6-g7xpk" event={"ID":"ab7915d2-c641-481f-a9f6-1ce1209c7e17","Type":"ContainerStarted","Data":"402155193e5eb4df46bcd8d42397b51050e1b818f7e03209b35eab08249dd7aa"} Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.491520 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66fb7cb9f6-g7xpk" event={"ID":"ab7915d2-c641-481f-a9f6-1ce1209c7e17","Type":"ContainerStarted","Data":"d9a30db13c825fc0be017e8fb75f92508819e713cf7d8ca09ec58af79d3b329b"} Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.492295 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.506127 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-worker-7f975545dc-7gv92" event={"ID":"b18f0688-cbc1-49ae-a721-b964e45cc1ea","Type":"ContainerStarted","Data":"901fb7fc6fd3fa2a4c177a329c248da3a65764fd0af4f47c835941784fd785b9"} Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.514825 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.514872 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-ovsdbserver-sb\") on node \"crc\" 
DevicePath \"\"" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.514885 4765 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.514902 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.514916 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/414bf196-06ed-4a99-99b0-493cac763e5e-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.518334 4765 generic.go:334] "Generic (PLEG): container finished" podID="daa667b9-5f24-4ae0-8278-1585b136fc1d" containerID="34fd6dbec025944f2266c0c39a65eaf3da9bdbabac03b4d5776fc72f5490addc" exitCode=0 Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.518716 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" event={"ID":"daa667b9-5f24-4ae0-8278-1585b136fc1d","Type":"ContainerDied","Data":"34fd6dbec025944f2266c0c39a65eaf3da9bdbabac03b4d5776fc72f5490addc"} Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.518789 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" event={"ID":"daa667b9-5f24-4ae0-8278-1585b136fc1d","Type":"ContainerStarted","Data":"1f1f8b877faa07885c01a15f4a0da8d37a501735d6899012f0fde2d48e935e2e"} Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.524500 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-66fb7cb9f6-g7xpk" podStartSLOduration=3.524474698 podStartE2EDuration="3.524474698s" podCreationTimestamp="2026-03-19 10:43:09 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:12.514422085 +0000 UTC m=+1290.863367647" watchObservedRunningTime="2026-03-19 10:43:12.524474698 +0000 UTC m=+1290.873420240" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.565525 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-worker-7f975545dc-7gv92" podStartSLOduration=3.431048769 podStartE2EDuration="5.565502841s" podCreationTimestamp="2026-03-19 10:43:07 +0000 UTC" firstStartedPulling="2026-03-19 10:43:08.571836854 +0000 UTC m=+1286.920782396" lastFinishedPulling="2026-03-19 10:43:10.706290926 +0000 UTC m=+1289.055236468" observedRunningTime="2026-03-19 10:43:12.540253816 +0000 UTC m=+1290.889199368" watchObservedRunningTime="2026-03-19 10:43:12.565502841 +0000 UTC m=+1290.914448383" Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.827681 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-sxpx9"] Mar 19 10:43:12 crc kubenswrapper[4765]: I0319 10:43:12.836865 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-7c67bffd47-sxpx9"] Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.094794 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-67b57ccc79-wx8k9"] Mar 19 10:43:13 crc kubenswrapper[4765]: E0319 10:43:13.095383 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="414bf196-06ed-4a99-99b0-493cac763e5e" containerName="dnsmasq-dns" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.095407 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="414bf196-06ed-4a99-99b0-493cac763e5e" containerName="dnsmasq-dns" Mar 19 10:43:13 crc kubenswrapper[4765]: E0319 10:43:13.095437 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="414bf196-06ed-4a99-99b0-493cac763e5e" containerName="init" Mar 19 10:43:13 crc 
kubenswrapper[4765]: I0319 10:43:13.095445 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="414bf196-06ed-4a99-99b0-493cac763e5e" containerName="init" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.095677 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="414bf196-06ed-4a99-99b0-493cac763e5e" containerName="dnsmasq-dns" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.096857 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.108535 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-public-svc" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.108808 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-neutron-internal-svc" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.122924 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67b57ccc79-wx8k9"] Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.232650 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2c27w\" (UniqueName: \"kubernetes.io/projected/121bed92-a505-40d7-83f1-f3163088df2a-kube-api-access-2c27w\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.232759 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-config\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.232834 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-combined-ca-bundle\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.232928 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-public-tls-certs\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.233661 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-httpd-config\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.233699 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-internal-tls-certs\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.233925 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-ovndb-tls-certs\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.335532 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-config\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.335581 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-combined-ca-bundle\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.335603 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-public-tls-certs\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.335632 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-httpd-config\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.335650 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-internal-tls-certs\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.335702 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-ovndb-tls-certs\") pod 
\"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.335751 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2c27w\" (UniqueName: \"kubernetes.io/projected/121bed92-a505-40d7-83f1-f3163088df2a-kube-api-access-2c27w\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.340704 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-combined-ca-bundle\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.341347 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-config\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.355226 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-internal-tls-certs\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.356731 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-ovndb-tls-certs\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" 
Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.356875 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-httpd-config\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.357630 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/121bed92-a505-40d7-83f1-f3163088df2a-public-tls-certs\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.362679 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2c27w\" (UniqueName: \"kubernetes.io/projected/121bed92-a505-40d7-83f1-f3163088df2a-kube-api-access-2c27w\") pod \"neutron-67b57ccc79-wx8k9\" (UID: \"121bed92-a505-40d7-83f1-f3163088df2a\") " pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:13 crc kubenswrapper[4765]: I0319 10:43:13.470324 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.371675 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="414bf196-06ed-4a99-99b0-493cac763e5e" path="/var/lib/kubelet/pods/414bf196-06ed-4a99-99b0-493cac763e5e/volumes" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.481835 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/barbican-api-77865d778-4kfkp"] Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.487810 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.491587 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-public-svc" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.494068 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77865d778-4kfkp"] Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.500869 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-barbican-internal-svc" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.562773 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446b5005-1960-413b-8ab2-f0da071ab4ba-combined-ca-bundle\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.562865 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/446b5005-1960-413b-8ab2-f0da071ab4ba-internal-tls-certs\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.562913 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/446b5005-1960-413b-8ab2-f0da071ab4ba-public-tls-certs\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.562988 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/446b5005-1960-413b-8ab2-f0da071ab4ba-logs\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.563032 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/446b5005-1960-413b-8ab2-f0da071ab4ba-config-data-custom\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.563076 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/446b5005-1960-413b-8ab2-f0da071ab4ba-config-data\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.563120 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lnbff\" (UniqueName: \"kubernetes.io/projected/446b5005-1960-413b-8ab2-f0da071ab4ba-kube-api-access-lnbff\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.664528 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/446b5005-1960-413b-8ab2-f0da071ab4ba-logs\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.664601 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/446b5005-1960-413b-8ab2-f0da071ab4ba-config-data-custom\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.664811 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/446b5005-1960-413b-8ab2-f0da071ab4ba-config-data\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.664879 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lnbff\" (UniqueName: \"kubernetes.io/projected/446b5005-1960-413b-8ab2-f0da071ab4ba-kube-api-access-lnbff\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.664930 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446b5005-1960-413b-8ab2-f0da071ab4ba-combined-ca-bundle\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.665018 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/446b5005-1960-413b-8ab2-f0da071ab4ba-internal-tls-certs\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.665058 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/446b5005-1960-413b-8ab2-f0da071ab4ba-public-tls-certs\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.665994 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/446b5005-1960-413b-8ab2-f0da071ab4ba-logs\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.674772 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/446b5005-1960-413b-8ab2-f0da071ab4ba-public-tls-certs\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.683321 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/446b5005-1960-413b-8ab2-f0da071ab4ba-config-data-custom\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.685065 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/446b5005-1960-413b-8ab2-f0da071ab4ba-combined-ca-bundle\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.685457 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/446b5005-1960-413b-8ab2-f0da071ab4ba-internal-tls-certs\") pod 
\"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.695495 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lnbff\" (UniqueName: \"kubernetes.io/projected/446b5005-1960-413b-8ab2-f0da071ab4ba-kube-api-access-lnbff\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.702479 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/446b5005-1960-413b-8ab2-f0da071ab4ba-config-data\") pod \"barbican-api-77865d778-4kfkp\" (UID: \"446b5005-1960-413b-8ab2-f0da071ab4ba\") " pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:14 crc kubenswrapper[4765]: I0319 10:43:14.810289 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:15 crc kubenswrapper[4765]: I0319 10:43:15.232459 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/horizon-6c6ff5646d-fmdz2" podUID="b506e362-44bf-4267-bea0-18131aa011fa" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.153:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.153:8443: connect: connection refused" Mar 19 10:43:16 crc kubenswrapper[4765]: I0319 10:43:16.654418 4765 scope.go:117] "RemoveContainer" containerID="c4cad0119cdeee16d622032a335bea3e96eab74f7345d5c781d9b4b953c6bd6e" Mar 19 10:43:16 crc kubenswrapper[4765]: I0319 10:43:16.787041 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:43:16 crc kubenswrapper[4765]: I0319 10:43:16.914264 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-config-data\") pod \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " Mar 19 10:43:16 crc kubenswrapper[4765]: I0319 10:43:16.914441 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-combined-ca-bundle\") pod \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " Mar 19 10:43:16 crc kubenswrapper[4765]: I0319 10:43:16.914482 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tn9jt\" (UniqueName: \"kubernetes.io/projected/21c84c7a-03f0-4ab5-a259-95a351cbdf13-kube-api-access-tn9jt\") pod \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " Mar 19 10:43:16 crc kubenswrapper[4765]: I0319 10:43:16.914521 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-scripts\") pod \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " Mar 19 10:43:16 crc kubenswrapper[4765]: I0319 10:43:16.914546 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-db-sync-config-data\") pod \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " Mar 19 10:43:16 crc kubenswrapper[4765]: I0319 10:43:16.914622 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" 
(UniqueName: \"kubernetes.io/host-path/21c84c7a-03f0-4ab5-a259-95a351cbdf13-etc-machine-id\") pod \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\" (UID: \"21c84c7a-03f0-4ab5-a259-95a351cbdf13\") " Mar 19 10:43:16 crc kubenswrapper[4765]: I0319 10:43:16.914941 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/21c84c7a-03f0-4ab5-a259-95a351cbdf13-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "21c84c7a-03f0-4ab5-a259-95a351cbdf13" (UID: "21c84c7a-03f0-4ab5-a259-95a351cbdf13"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:43:16 crc kubenswrapper[4765]: I0319 10:43:16.915233 4765 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/21c84c7a-03f0-4ab5-a259-95a351cbdf13-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:16 crc kubenswrapper[4765]: I0319 10:43:16.924335 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "21c84c7a-03f0-4ab5-a259-95a351cbdf13" (UID: "21c84c7a-03f0-4ab5-a259-95a351cbdf13"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:16 crc kubenswrapper[4765]: I0319 10:43:16.927842 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-scripts" (OuterVolumeSpecName: "scripts") pod "21c84c7a-03f0-4ab5-a259-95a351cbdf13" (UID: "21c84c7a-03f0-4ab5-a259-95a351cbdf13"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:16 crc kubenswrapper[4765]: I0319 10:43:16.928296 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/21c84c7a-03f0-4ab5-a259-95a351cbdf13-kube-api-access-tn9jt" (OuterVolumeSpecName: "kube-api-access-tn9jt") pod "21c84c7a-03f0-4ab5-a259-95a351cbdf13" (UID: "21c84c7a-03f0-4ab5-a259-95a351cbdf13"). InnerVolumeSpecName "kube-api-access-tn9jt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:16 crc kubenswrapper[4765]: I0319 10:43:16.950635 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "21c84c7a-03f0-4ab5-a259-95a351cbdf13" (UID: "21c84c7a-03f0-4ab5-a259-95a351cbdf13"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:16 crc kubenswrapper[4765]: I0319 10:43:16.981373 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-config-data" (OuterVolumeSpecName: "config-data") pod "21c84c7a-03f0-4ab5-a259-95a351cbdf13" (UID: "21c84c7a-03f0-4ab5-a259-95a351cbdf13"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:17 crc kubenswrapper[4765]: I0319 10:43:17.016896 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tn9jt\" (UniqueName: \"kubernetes.io/projected/21c84c7a-03f0-4ab5-a259-95a351cbdf13-kube-api-access-tn9jt\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:17 crc kubenswrapper[4765]: I0319 10:43:17.016939 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:17 crc kubenswrapper[4765]: I0319 10:43:17.016972 4765 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:17 crc kubenswrapper[4765]: I0319 10:43:17.016985 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:17 crc kubenswrapper[4765]: I0319 10:43:17.016996 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/21c84c7a-03f0-4ab5-a259-95a351cbdf13-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:17 crc kubenswrapper[4765]: I0319 10:43:17.118624 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:43:17 crc kubenswrapper[4765]: I0319 10:43:17.576261 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-db-sync-mbhfs" event={"ID":"21c84c7a-03f0-4ab5-a259-95a351cbdf13","Type":"ContainerDied","Data":"0c37c357db57e9b538ce00318541aba7846373220b57be0d3e67f96109aaaf3c"} Mar 19 10:43:17 crc kubenswrapper[4765]: I0319 10:43:17.576301 4765 pod_container_deletor.go:80] "Container not found in 
pod's containers" containerID="0c37c357db57e9b538ce00318541aba7846373220b57be0d3e67f96109aaaf3c" Mar 19 10:43:17 crc kubenswrapper[4765]: I0319 10:43:17.576282 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-db-sync-mbhfs" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.025804 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 10:43:18 crc kubenswrapper[4765]: E0319 10:43:18.049868 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="21c84c7a-03f0-4ab5-a259-95a351cbdf13" containerName="cinder-db-sync" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.049931 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="21c84c7a-03f0-4ab5-a259-95a351cbdf13" containerName="cinder-db-sync" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.050408 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="21c84c7a-03f0-4ab5-a259-95a351cbdf13" containerName="cinder-db-sync" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.083414 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.083546 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.095175 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scripts" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.095436 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.096301 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-cinder-dockercfg-7z8jd" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.096656 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-config-data" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.126200 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-48d9j"] Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.162998 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.163115 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e45ebb9b-6922-446d-a35d-b659217c5ef9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.164595 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: 
\"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.164892 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lb77z\" (UniqueName: \"kubernetes.io/projected/e45ebb9b-6922-446d-a35d-b659217c5ef9-kube-api-access-lb77z\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.165490 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-scripts\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.165574 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-config-data\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.180428 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-v7b7s"] Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.183404 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.206564 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-v7b7s"] Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.272011 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-scripts\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.272061 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-config-data\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.272101 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-config\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.272120 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.272221 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.272241 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-dns-svc\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.272255 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.272288 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q4ds4\" (UniqueName: \"kubernetes.io/projected/630c00dd-9d08-4035-88a2-0533792f2118-kube-api-access-q4ds4\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.272310 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e45ebb9b-6922-446d-a35d-b659217c5ef9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.272332 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: 
\"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.272360 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lb77z\" (UniqueName: \"kubernetes.io/projected/e45ebb9b-6922-446d-a35d-b659217c5ef9-kube-api-access-lb77z\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.272376 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.273304 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e45ebb9b-6922-446d-a35d-b659217c5ef9-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.279356 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-config-data\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.279805 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-scripts\") pod \"cinder-scheduler-0\" (UID: 
\"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.290525 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.291237 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.296721 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lb77z\" (UniqueName: \"kubernetes.io/projected/e45ebb9b-6922-446d-a35d-b659217c5ef9-kube-api-access-lb77z\") pod \"cinder-scheduler-0\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.352238 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.353691 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.360905 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.373858 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-dns-svc\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.373913 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.373981 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q4ds4\" (UniqueName: \"kubernetes.io/projected/630c00dd-9d08-4035-88a2-0533792f2118-kube-api-access-q4ds4\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.374049 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.374122 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-config\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.374144 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.378762 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-dns-svc\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.379468 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-ovsdbserver-sb\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.380085 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-ovsdbserver-nb\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.380432 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-config\") pod 
\"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.380755 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-dns-swift-storage-0\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.396440 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.411612 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q4ds4\" (UniqueName: \"kubernetes.io/projected/630c00dd-9d08-4035-88a2-0533792f2118-kube-api-access-q4ds4\") pod \"dnsmasq-dns-6578955fd5-v7b7s\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.437521 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.475558 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f567275e-0c40-4ef2-8c5f-fb40aad223f8-logs\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.475612 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-config-data-custom\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.475736 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4wrv8\" (UniqueName: \"kubernetes.io/projected/f567275e-0c40-4ef2-8c5f-fb40aad223f8-kube-api-access-4wrv8\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.475769 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-scripts\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.475801 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-config-data\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.475917 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f567275e-0c40-4ef2-8c5f-fb40aad223f8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.476012 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.535074 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.577922 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f567275e-0c40-4ef2-8c5f-fb40aad223f8-logs\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.577989 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-config-data-custom\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.578011 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4wrv8\" (UniqueName: \"kubernetes.io/projected/f567275e-0c40-4ef2-8c5f-fb40aad223f8-kube-api-access-4wrv8\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 
10:43:18.578029 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-scripts\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.578053 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-config-data\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.578131 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f567275e-0c40-4ef2-8c5f-fb40aad223f8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.578180 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.578241 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f567275e-0c40-4ef2-8c5f-fb40aad223f8-etc-machine-id\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.578437 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f567275e-0c40-4ef2-8c5f-fb40aad223f8-logs\") pod \"cinder-api-0\" (UID: 
\"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.582670 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-config-data\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.583191 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.584152 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-config-data-custom\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.585304 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-scripts\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.598528 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4wrv8\" (UniqueName: \"kubernetes.io/projected/f567275e-0c40-4ef2-8c5f-fb40aad223f8-kube-api-access-4wrv8\") pod \"cinder-api-0\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " pod="openstack/cinder-api-0" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.644817 4765 scope.go:117] "RemoveContainer" containerID="29e8cab27c25734a862edff8e07ba72fa6929ae65638db09d8e422c4763162e8" Mar 
19 10:43:18 crc kubenswrapper[4765]: E0319 10:43:18.646234 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"29e8cab27c25734a862edff8e07ba72fa6929ae65638db09d8e422c4763162e8\": container with ID starting with 29e8cab27c25734a862edff8e07ba72fa6929ae65638db09d8e422c4763162e8 not found: ID does not exist" containerID="29e8cab27c25734a862edff8e07ba72fa6929ae65638db09d8e422c4763162e8" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.646279 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"29e8cab27c25734a862edff8e07ba72fa6929ae65638db09d8e422c4763162e8"} err="failed to get container status \"29e8cab27c25734a862edff8e07ba72fa6929ae65638db09d8e422c4763162e8\": rpc error: code = NotFound desc = could not find container \"29e8cab27c25734a862edff8e07ba72fa6929ae65638db09d8e422c4763162e8\": container with ID starting with 29e8cab27c25734a862edff8e07ba72fa6929ae65638db09d8e422c4763162e8 not found: ID does not exist" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.646307 4765 scope.go:117] "RemoveContainer" containerID="c4cad0119cdeee16d622032a335bea3e96eab74f7345d5c781d9b4b953c6bd6e" Mar 19 10:43:18 crc kubenswrapper[4765]: E0319 10:43:18.650368 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4cad0119cdeee16d622032a335bea3e96eab74f7345d5c781d9b4b953c6bd6e\": container with ID starting with c4cad0119cdeee16d622032a335bea3e96eab74f7345d5c781d9b4b953c6bd6e not found: ID does not exist" containerID="c4cad0119cdeee16d622032a335bea3e96eab74f7345d5c781d9b4b953c6bd6e" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.650397 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4cad0119cdeee16d622032a335bea3e96eab74f7345d5c781d9b4b953c6bd6e"} err="failed to get container status 
\"c4cad0119cdeee16d622032a335bea3e96eab74f7345d5c781d9b4b953c6bd6e\": rpc error: code = NotFound desc = could not find container \"c4cad0119cdeee16d622032a335bea3e96eab74f7345d5c781d9b4b953c6bd6e\": container with ID starting with c4cad0119cdeee16d622032a335bea3e96eab74f7345d5c781d9b4b953c6bd6e not found: ID does not exist" Mar 19 10:43:18 crc kubenswrapper[4765]: I0319 10:43:18.773734 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.522277 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-67b57ccc79-wx8k9"] Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.533185 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 10:43:19 crc kubenswrapper[4765]: W0319 10:43:19.561114 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode45ebb9b_6922_446d_a35d_b659217c5ef9.slice/crio-e46892dc8e6a42351183ba5a1f7f97989e6b99f77ddb96d220f23d87c8b3e6f5 WatchSource:0}: Error finding container e46892dc8e6a42351183ba5a1f7f97989e6b99f77ddb96d220f23d87c8b3e6f5: Status 404 returned error can't find the container with id e46892dc8e6a42351183ba5a1f7f97989e6b99f77ddb96d220f23d87c8b3e6f5 Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.562742 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-v7b7s"] Mar 19 10:43:19 crc kubenswrapper[4765]: W0319 10:43:19.577819 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod630c00dd_9d08_4035_88a2_0533792f2118.slice/crio-a88904a133b3aff5a39e966448c7d4a16e8bb1f15038068d61889d0b1e5f50d4 WatchSource:0}: Error finding container a88904a133b3aff5a39e966448c7d4a16e8bb1f15038068d61889d0b1e5f50d4: Status 404 returned error can't find the container with id 
a88904a133b3aff5a39e966448c7d4a16e8bb1f15038068d61889d0b1e5f50d4 Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.617882 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" event={"ID":"630c00dd-9d08-4035-88a2-0533792f2118","Type":"ContainerStarted","Data":"a88904a133b3aff5a39e966448c7d4a16e8bb1f15038068d61889d0b1e5f50d4"} Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.625767 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67b57ccc79-wx8k9" event={"ID":"121bed92-a505-40d7-83f1-f3163088df2a","Type":"ContainerStarted","Data":"e87559a990ea049c4d551a76515f54635258cbb8d06f28c1be93156a45dce9a0"} Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.633596 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e45ebb9b-6922-446d-a35d-b659217c5ef9","Type":"ContainerStarted","Data":"e46892dc8e6a42351183ba5a1f7f97989e6b99f77ddb96d220f23d87c8b3e6f5"} Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.637102 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc731af7-c5a0-4d4e-9f33-9deec0f322ee","Type":"ContainerStarted","Data":"db71f055d416d4d6c3a3c2046b4b2bd204dffd6df77e310bdc4d4d0d9dba7e05"} Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.637389 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="ceilometer-central-agent" containerID="cri-o://b1e3752be6605f29ac474e229120c00f527829b9f8aa536add5a101957307f0e" gracePeriod=30 Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.637565 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.637631 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="proxy-httpd" containerID="cri-o://db71f055d416d4d6c3a3c2046b4b2bd204dffd6df77e310bdc4d4d0d9dba7e05" gracePeriod=30 Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.637682 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="sg-core" containerID="cri-o://c25af2fc24add53e6b2bf0a6e5ecc96f539fac4f29346303ea42f3afd37f9d2c" gracePeriod=30 Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.637720 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="ceilometer-notification-agent" containerID="cri-o://4f334742b0e1a014b834b4129a622ca5ef19d7ed5b3555050a9039cf73ec1210" gracePeriod=30 Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.681826 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.944978536 podStartE2EDuration="1m4.681801169s" podCreationTimestamp="2026-03-19 10:42:15 +0000 UTC" firstStartedPulling="2026-03-19 10:42:17.011837464 +0000 UTC m=+1235.360783006" lastFinishedPulling="2026-03-19 10:43:18.748660097 +0000 UTC m=+1297.097605639" observedRunningTime="2026-03-19 10:43:19.675222255 +0000 UTC m=+1298.024167807" watchObservedRunningTime="2026-03-19 10:43:19.681801169 +0000 UTC m=+1298.030746721" Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.682175 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" event={"ID":"65d1a29f-39b3-40d7-9db2-246fc05348cc","Type":"ContainerStarted","Data":"433022c52e125a2106d1d1fc2c2d51ff30dca269a7b5e21d8eef89170f64a507"} Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.695310 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" 
event={"ID":"daa667b9-5f24-4ae0-8278-1585b136fc1d","Type":"ContainerStarted","Data":"93bbaad867dd486e00b9ed8d9ccac85be775e664ae79b0f49e545b4fbea33ca0"} Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.695495 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" podUID="daa667b9-5f24-4ae0-8278-1585b136fc1d" containerName="dnsmasq-dns" containerID="cri-o://93bbaad867dd486e00b9ed8d9ccac85be775e664ae79b0f49e545b4fbea33ca0" gracePeriod=10 Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.695574 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.707538 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-keystone-listener-6955bd84cd-t7qkv" podStartSLOduration=10.031431631 podStartE2EDuration="12.707515061s" podCreationTimestamp="2026-03-19 10:43:07 +0000 UTC" firstStartedPulling="2026-03-19 10:43:09.044272238 +0000 UTC m=+1287.393217780" lastFinishedPulling="2026-03-19 10:43:11.720355668 +0000 UTC m=+1290.069301210" observedRunningTime="2026-03-19 10:43:19.704518557 +0000 UTC m=+1298.053464099" watchObservedRunningTime="2026-03-19 10:43:19.707515061 +0000 UTC m=+1298.056460603" Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.744131 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/barbican-api-77865d778-4kfkp"] Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.764374 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" podStartSLOduration=10.764353385 podStartE2EDuration="10.764353385s" podCreationTimestamp="2026-03-19 10:43:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:19.759003485 +0000 UTC m=+1298.107949027" 
watchObservedRunningTime="2026-03-19 10:43:19.764353385 +0000 UTC m=+1298.113298927" Mar 19 10:43:19 crc kubenswrapper[4765]: I0319 10:43:19.921098 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.111160 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.524668 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.548357 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-ovsdbserver-nb\") pod \"daa667b9-5f24-4ae0-8278-1585b136fc1d\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.548476 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-config\") pod \"daa667b9-5f24-4ae0-8278-1585b136fc1d\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.548566 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-dns-swift-storage-0\") pod \"daa667b9-5f24-4ae0-8278-1585b136fc1d\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.548611 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-ovsdbserver-sb\") pod \"daa667b9-5f24-4ae0-8278-1585b136fc1d\" (UID: 
\"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.548664 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r5b4r\" (UniqueName: \"kubernetes.io/projected/daa667b9-5f24-4ae0-8278-1585b136fc1d-kube-api-access-r5b4r\") pod \"daa667b9-5f24-4ae0-8278-1585b136fc1d\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.548707 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-dns-svc\") pod \"daa667b9-5f24-4ae0-8278-1585b136fc1d\" (UID: \"daa667b9-5f24-4ae0-8278-1585b136fc1d\") " Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.609394 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/daa667b9-5f24-4ae0-8278-1585b136fc1d-kube-api-access-r5b4r" (OuterVolumeSpecName: "kube-api-access-r5b4r") pod "daa667b9-5f24-4ae0-8278-1585b136fc1d" (UID: "daa667b9-5f24-4ae0-8278-1585b136fc1d"). InnerVolumeSpecName "kube-api-access-r5b4r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.652192 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r5b4r\" (UniqueName: \"kubernetes.io/projected/daa667b9-5f24-4ae0-8278-1585b136fc1d-kube-api-access-r5b4r\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.656874 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "daa667b9-5f24-4ae0-8278-1585b136fc1d" (UID: "daa667b9-5f24-4ae0-8278-1585b136fc1d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.711050 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "daa667b9-5f24-4ae0-8278-1585b136fc1d" (UID: "daa667b9-5f24-4ae0-8278-1585b136fc1d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.711284 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-config" (OuterVolumeSpecName: "config") pod "daa667b9-5f24-4ae0-8278-1585b136fc1d" (UID: "daa667b9-5f24-4ae0-8278-1585b136fc1d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.756041 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.756084 4765 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.756107 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.761635 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "daa667b9-5f24-4ae0-8278-1585b136fc1d" (UID: 
"daa667b9-5f24-4ae0-8278-1585b136fc1d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.764944 4765 generic.go:334] "Generic (PLEG): container finished" podID="daa667b9-5f24-4ae0-8278-1585b136fc1d" containerID="93bbaad867dd486e00b9ed8d9ccac85be775e664ae79b0f49e545b4fbea33ca0" exitCode=0 Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.765122 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" event={"ID":"daa667b9-5f24-4ae0-8278-1585b136fc1d","Type":"ContainerDied","Data":"93bbaad867dd486e00b9ed8d9ccac85be775e664ae79b0f49e545b4fbea33ca0"} Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.765169 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" event={"ID":"daa667b9-5f24-4ae0-8278-1585b136fc1d","Type":"ContainerDied","Data":"1f1f8b877faa07885c01a15f4a0da8d37a501735d6899012f0fde2d48e935e2e"} Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.765191 4765 scope.go:117] "RemoveContainer" containerID="93bbaad867dd486e00b9ed8d9ccac85be775e664ae79b0f49e545b4fbea33ca0" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.765352 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-848cf88cfc-48d9j" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.803404 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "daa667b9-5f24-4ae0-8278-1585b136fc1d" (UID: "daa667b9-5f24-4ae0-8278-1585b136fc1d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.803614 4765 generic.go:334] "Generic (PLEG): container finished" podID="630c00dd-9d08-4035-88a2-0533792f2118" containerID="fd3dfdf815374707e8055bce011c2332d13dcdaaae50317da5e096bedf59b64e" exitCode=0 Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.803703 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" event={"ID":"630c00dd-9d08-4035-88a2-0533792f2118","Type":"ContainerDied","Data":"fd3dfdf815374707e8055bce011c2332d13dcdaaae50317da5e096bedf59b64e"} Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.822923 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f567275e-0c40-4ef2-8c5f-fb40aad223f8","Type":"ContainerStarted","Data":"23b661699f9206411e682dfbf2c1e896ea6afe4e7c932d35aa1382d0bcce9704"} Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.851381 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77865d778-4kfkp" event={"ID":"446b5005-1960-413b-8ab2-f0da071ab4ba","Type":"ContainerStarted","Data":"b614ef6c85332211f57a4d3694791e7e217d8ef7f018422a2011fc4dddc13690"} Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.851439 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77865d778-4kfkp" event={"ID":"446b5005-1960-413b-8ab2-f0da071ab4ba","Type":"ContainerStarted","Data":"dfcec73c58a3664cc3416110f4163ae453a5018ebbe9398f047b7c8ad1b30a94"} Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.857713 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.857747 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: 
\"kubernetes.io/configmap/daa667b9-5f24-4ae0-8278-1585b136fc1d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.865175 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67b57ccc79-wx8k9" event={"ID":"121bed92-a505-40d7-83f1-f3163088df2a","Type":"ContainerStarted","Data":"3545cd2e737ca88c8553fc53a970f72aeadc2997660b08afbf5bf1df65ee5317"} Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.901269 4765 generic.go:334] "Generic (PLEG): container finished" podID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerID="db71f055d416d4d6c3a3c2046b4b2bd204dffd6df77e310bdc4d4d0d9dba7e05" exitCode=0 Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.901301 4765 generic.go:334] "Generic (PLEG): container finished" podID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerID="c25af2fc24add53e6b2bf0a6e5ecc96f539fac4f29346303ea42f3afd37f9d2c" exitCode=2 Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.901309 4765 generic.go:334] "Generic (PLEG): container finished" podID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerID="b1e3752be6605f29ac474e229120c00f527829b9f8aa536add5a101957307f0e" exitCode=0 Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.901338 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc731af7-c5a0-4d4e-9f33-9deec0f322ee","Type":"ContainerDied","Data":"db71f055d416d4d6c3a3c2046b4b2bd204dffd6df77e310bdc4d4d0d9dba7e05"} Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.901377 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc731af7-c5a0-4d4e-9f33-9deec0f322ee","Type":"ContainerDied","Data":"c25af2fc24add53e6b2bf0a6e5ecc96f539fac4f29346303ea42f3afd37f9d2c"} Mar 19 10:43:20 crc kubenswrapper[4765]: I0319 10:43:20.901391 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"fc731af7-c5a0-4d4e-9f33-9deec0f322ee","Type":"ContainerDied","Data":"b1e3752be6605f29ac474e229120c00f527829b9f8aa536add5a101957307f0e"} Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.008934 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.017599 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.127231 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-48d9j"] Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.143850 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-848cf88cfc-48d9j"] Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.159839 4765 scope.go:117] "RemoveContainer" containerID="34fd6dbec025944f2266c0c39a65eaf3da9bdbabac03b4d5776fc72f5490addc" Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.347565 4765 scope.go:117] "RemoveContainer" containerID="93bbaad867dd486e00b9ed8d9ccac85be775e664ae79b0f49e545b4fbea33ca0" Mar 19 10:43:21 crc kubenswrapper[4765]: E0319 10:43:21.351170 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93bbaad867dd486e00b9ed8d9ccac85be775e664ae79b0f49e545b4fbea33ca0\": container with ID starting with 93bbaad867dd486e00b9ed8d9ccac85be775e664ae79b0f49e545b4fbea33ca0 not found: ID does not exist" containerID="93bbaad867dd486e00b9ed8d9ccac85be775e664ae79b0f49e545b4fbea33ca0" Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.351229 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93bbaad867dd486e00b9ed8d9ccac85be775e664ae79b0f49e545b4fbea33ca0"} err="failed to get container status \"93bbaad867dd486e00b9ed8d9ccac85be775e664ae79b0f49e545b4fbea33ca0\": rpc error: code = NotFound desc 
= could not find container \"93bbaad867dd486e00b9ed8d9ccac85be775e664ae79b0f49e545b4fbea33ca0\": container with ID starting with 93bbaad867dd486e00b9ed8d9ccac85be775e664ae79b0f49e545b4fbea33ca0 not found: ID does not exist" Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.351259 4765 scope.go:117] "RemoveContainer" containerID="34fd6dbec025944f2266c0c39a65eaf3da9bdbabac03b4d5776fc72f5490addc" Mar 19 10:43:21 crc kubenswrapper[4765]: E0319 10:43:21.352219 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34fd6dbec025944f2266c0c39a65eaf3da9bdbabac03b4d5776fc72f5490addc\": container with ID starting with 34fd6dbec025944f2266c0c39a65eaf3da9bdbabac03b4d5776fc72f5490addc not found: ID does not exist" containerID="34fd6dbec025944f2266c0c39a65eaf3da9bdbabac03b4d5776fc72f5490addc" Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.352247 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34fd6dbec025944f2266c0c39a65eaf3da9bdbabac03b4d5776fc72f5490addc"} err="failed to get container status \"34fd6dbec025944f2266c0c39a65eaf3da9bdbabac03b4d5776fc72f5490addc\": rpc error: code = NotFound desc = could not find container \"34fd6dbec025944f2266c0c39a65eaf3da9bdbabac03b4d5776fc72f5490addc\": container with ID starting with 34fd6dbec025944f2266c0c39a65eaf3da9bdbabac03b4d5776fc72f5490addc not found: ID does not exist" Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.937175 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f567275e-0c40-4ef2-8c5f-fb40aad223f8","Type":"ContainerStarted","Data":"e2f21f434018f81ca084379be893dd45c99b806338b7261cd698760fdfdad3e7"} Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.946759 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" 
event={"ID":"630c00dd-9d08-4035-88a2-0533792f2118","Type":"ContainerStarted","Data":"ae106f7b6a7ef53eb6414ea44327434d663771da87646889db8d4223430fd2ed"} Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.951869 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.956869 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-77865d778-4kfkp" event={"ID":"446b5005-1960-413b-8ab2-f0da071ab4ba","Type":"ContainerStarted","Data":"5bc3ecf37af602c18e7caec35c77d76c141f4ebdd98a7647e99f759da19d8b9f"} Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.958770 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.959091 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.969342 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-67b57ccc79-wx8k9" event={"ID":"121bed92-a505-40d7-83f1-f3163088df2a","Type":"ContainerStarted","Data":"0144f8a428e9a4387f29795b1777fca5757fd4992489953d639ba881f028ac73"} Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.971603 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:21 crc kubenswrapper[4765]: I0319 10:43:21.998555 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" podStartSLOduration=3.998524146 podStartE2EDuration="3.998524146s" podCreationTimestamp="2026-03-19 10:43:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:21.98656594 +0000 UTC m=+1300.335511482" 
watchObservedRunningTime="2026-03-19 10:43:21.998524146 +0000 UTC m=+1300.347469688" Mar 19 10:43:22 crc kubenswrapper[4765]: I0319 10:43:22.036135 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/barbican-api-77865d778-4kfkp" podStartSLOduration=8.03610721 podStartE2EDuration="8.03610721s" podCreationTimestamp="2026-03-19 10:43:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:22.022619782 +0000 UTC m=+1300.371565334" watchObservedRunningTime="2026-03-19 10:43:22.03610721 +0000 UTC m=+1300.385052782" Mar 19 10:43:22 crc kubenswrapper[4765]: I0319 10:43:22.063392 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/neutron-67b57ccc79-wx8k9" podStartSLOduration=9.063369355 podStartE2EDuration="9.063369355s" podCreationTimestamp="2026-03-19 10:43:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:22.055822693 +0000 UTC m=+1300.404768235" watchObservedRunningTime="2026-03-19 10:43:22.063369355 +0000 UTC m=+1300.412314897" Mar 19 10:43:22 crc kubenswrapper[4765]: I0319 10:43:22.236051 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:22 crc kubenswrapper[4765]: I0319 10:43:22.394405 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="daa667b9-5f24-4ae0-8278-1585b136fc1d" path="/var/lib/kubelet/pods/daa667b9-5f24-4ae0-8278-1585b136fc1d/volumes" Mar 19 10:43:23 crc kubenswrapper[4765]: I0319 10:43:23.003436 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f567275e-0c40-4ef2-8c5f-fb40aad223f8","Type":"ContainerStarted","Data":"1a32fdc48ba6fef3a705f3889d8d3b36ffec2096785e8d52ad921d64bba59bdf"} Mar 19 10:43:23 crc kubenswrapper[4765]: I0319 
10:43:23.004347 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 19 10:43:23 crc kubenswrapper[4765]: I0319 10:43:23.003739 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f567275e-0c40-4ef2-8c5f-fb40aad223f8" containerName="cinder-api-log" containerID="cri-o://e2f21f434018f81ca084379be893dd45c99b806338b7261cd698760fdfdad3e7" gracePeriod=30 Mar 19 10:43:23 crc kubenswrapper[4765]: I0319 10:43:23.004351 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-api-0" podUID="f567275e-0c40-4ef2-8c5f-fb40aad223f8" containerName="cinder-api" containerID="cri-o://1a32fdc48ba6fef3a705f3889d8d3b36ffec2096785e8d52ad921d64bba59bdf" gracePeriod=30 Mar 19 10:43:23 crc kubenswrapper[4765]: I0319 10:43:23.013341 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e45ebb9b-6922-446d-a35d-b659217c5ef9","Type":"ContainerStarted","Data":"e31d9a59a0246db0d2c4c7fa1c458e96d17a7772f13ed898afffd6e139750ceb"} Mar 19 10:43:23 crc kubenswrapper[4765]: I0319 10:43:23.038799 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=5.038734661 podStartE2EDuration="5.038734661s" podCreationTimestamp="2026-03-19 10:43:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:23.034321627 +0000 UTC m=+1301.383267169" watchObservedRunningTime="2026-03-19 10:43:23.038734661 +0000 UTC m=+1301.387680203" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.026338 4765 generic.go:334] "Generic (PLEG): container finished" podID="f567275e-0c40-4ef2-8c5f-fb40aad223f8" containerID="e2f21f434018f81ca084379be893dd45c99b806338b7261cd698760fdfdad3e7" exitCode=143 Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.026449 4765 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f567275e-0c40-4ef2-8c5f-fb40aad223f8","Type":"ContainerDied","Data":"e2f21f434018f81ca084379be893dd45c99b806338b7261cd698760fdfdad3e7"} Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.029736 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e45ebb9b-6922-446d-a35d-b659217c5ef9","Type":"ContainerStarted","Data":"ce3894d5acdf42457cd08ecc4a82f25a59317a8f34cbe51394e3aaf3524c56d1"} Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.053168 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=4.364276516 podStartE2EDuration="6.053144594s" podCreationTimestamp="2026-03-19 10:43:18 +0000 UTC" firstStartedPulling="2026-03-19 10:43:19.565040025 +0000 UTC m=+1297.913985557" lastFinishedPulling="2026-03-19 10:43:21.253908093 +0000 UTC m=+1299.602853635" observedRunningTime="2026-03-19 10:43:24.048832273 +0000 UTC m=+1302.397777825" watchObservedRunningTime="2026-03-19 10:43:24.053144594 +0000 UTC m=+1302.402090136" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.683929 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.747645 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-config-data\") pod \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.747726 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-sg-core-conf-yaml\") pod \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.747751 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vm6s\" (UniqueName: \"kubernetes.io/projected/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-kube-api-access-9vm6s\") pod \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.747793 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-scripts\") pod \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.747828 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-run-httpd\") pod \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.747902 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-log-httpd\") pod \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.748066 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-combined-ca-bundle\") pod \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.748645 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "fc731af7-c5a0-4d4e-9f33-9deec0f322ee" (UID: "fc731af7-c5a0-4d4e-9f33-9deec0f322ee"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.748709 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "fc731af7-c5a0-4d4e-9f33-9deec0f322ee" (UID: "fc731af7-c5a0-4d4e-9f33-9deec0f322ee"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.755358 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-kube-api-access-9vm6s" (OuterVolumeSpecName: "kube-api-access-9vm6s") pod "fc731af7-c5a0-4d4e-9f33-9deec0f322ee" (UID: "fc731af7-c5a0-4d4e-9f33-9deec0f322ee"). InnerVolumeSpecName "kube-api-access-9vm6s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.767209 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-scripts" (OuterVolumeSpecName: "scripts") pod "fc731af7-c5a0-4d4e-9f33-9deec0f322ee" (UID: "fc731af7-c5a0-4d4e-9f33-9deec0f322ee"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.781387 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "fc731af7-c5a0-4d4e-9f33-9deec0f322ee" (UID: "fc731af7-c5a0-4d4e-9f33-9deec0f322ee"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.828228 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "fc731af7-c5a0-4d4e-9f33-9deec0f322ee" (UID: "fc731af7-c5a0-4d4e-9f33-9deec0f322ee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.849237 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-config-data" (OuterVolumeSpecName: "config-data") pod "fc731af7-c5a0-4d4e-9f33-9deec0f322ee" (UID: "fc731af7-c5a0-4d4e-9f33-9deec0f322ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.849367 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-config-data\") pod \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\" (UID: \"fc731af7-c5a0-4d4e-9f33-9deec0f322ee\") " Mar 19 10:43:24 crc kubenswrapper[4765]: W0319 10:43:24.849712 4765 empty_dir.go:500] Warning: Unmount skipped because path does not exist: /var/lib/kubelet/pods/fc731af7-c5a0-4d4e-9f33-9deec0f322ee/volumes/kubernetes.io~secret/config-data Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.849731 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-config-data" (OuterVolumeSpecName: "config-data") pod "fc731af7-c5a0-4d4e-9f33-9deec0f322ee" (UID: "fc731af7-c5a0-4d4e-9f33-9deec0f322ee"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.850190 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.850248 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.850268 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vm6s\" (UniqueName: \"kubernetes.io/projected/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-kube-api-access-9vm6s\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.850285 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.850297 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.850311 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:24 crc kubenswrapper[4765]: I0319 10:43:24.850322 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc731af7-c5a0-4d4e-9f33-9deec0f322ee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.041121 4765 generic.go:334] "Generic 
(PLEG): container finished" podID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerID="4f334742b0e1a014b834b4129a622ca5ef19d7ed5b3555050a9039cf73ec1210" exitCode=0 Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.041204 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.041226 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc731af7-c5a0-4d4e-9f33-9deec0f322ee","Type":"ContainerDied","Data":"4f334742b0e1a014b834b4129a622ca5ef19d7ed5b3555050a9039cf73ec1210"} Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.041283 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"fc731af7-c5a0-4d4e-9f33-9deec0f322ee","Type":"ContainerDied","Data":"404ca7642462232243d4052d8da91384de2d1401b2c4aae33dc6940f76aa75ab"} Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.041307 4765 scope.go:117] "RemoveContainer" containerID="db71f055d416d4d6c3a3c2046b4b2bd204dffd6df77e310bdc4d4d0d9dba7e05" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.064508 4765 scope.go:117] "RemoveContainer" containerID="c25af2fc24add53e6b2bf0a6e5ecc96f539fac4f29346303ea42f3afd37f9d2c" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.086446 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.104156 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.104336 4765 scope.go:117] "RemoveContainer" containerID="4f334742b0e1a014b834b4129a622ca5ef19d7ed5b3555050a9039cf73ec1210" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.116030 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:43:25 crc kubenswrapper[4765]: E0319 10:43:25.116408 4765 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="ceilometer-central-agent" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.116422 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="ceilometer-central-agent" Mar 19 10:43:25 crc kubenswrapper[4765]: E0319 10:43:25.116431 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa667b9-5f24-4ae0-8278-1585b136fc1d" containerName="dnsmasq-dns" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.116439 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa667b9-5f24-4ae0-8278-1585b136fc1d" containerName="dnsmasq-dns" Mar 19 10:43:25 crc kubenswrapper[4765]: E0319 10:43:25.116455 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="ceilometer-notification-agent" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.116463 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="ceilometer-notification-agent" Mar 19 10:43:25 crc kubenswrapper[4765]: E0319 10:43:25.116484 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="sg-core" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.116491 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="sg-core" Mar 19 10:43:25 crc kubenswrapper[4765]: E0319 10:43:25.116506 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="daa667b9-5f24-4ae0-8278-1585b136fc1d" containerName="init" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.116513 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="daa667b9-5f24-4ae0-8278-1585b136fc1d" containerName="init" Mar 19 10:43:25 crc kubenswrapper[4765]: E0319 10:43:25.116523 4765 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="proxy-httpd" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.116531 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="proxy-httpd" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.116748 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="proxy-httpd" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.116768 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="ceilometer-central-agent" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.116782 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="sg-core" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.116793 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" containerName="ceilometer-notification-agent" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.116803 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="daa667b9-5f24-4ae0-8278-1585b136fc1d" containerName="dnsmasq-dns" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.118473 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.122876 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.122999 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.143069 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.190404 4765 scope.go:117] "RemoveContainer" containerID="b1e3752be6605f29ac474e229120c00f527829b9f8aa536add5a101957307f0e" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.223188 4765 scope.go:117] "RemoveContainer" containerID="db71f055d416d4d6c3a3c2046b4b2bd204dffd6df77e310bdc4d4d0d9dba7e05" Mar 19 10:43:25 crc kubenswrapper[4765]: E0319 10:43:25.223787 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db71f055d416d4d6c3a3c2046b4b2bd204dffd6df77e310bdc4d4d0d9dba7e05\": container with ID starting with db71f055d416d4d6c3a3c2046b4b2bd204dffd6df77e310bdc4d4d0d9dba7e05 not found: ID does not exist" containerID="db71f055d416d4d6c3a3c2046b4b2bd204dffd6df77e310bdc4d4d0d9dba7e05" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.223865 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db71f055d416d4d6c3a3c2046b4b2bd204dffd6df77e310bdc4d4d0d9dba7e05"} err="failed to get container status \"db71f055d416d4d6c3a3c2046b4b2bd204dffd6df77e310bdc4d4d0d9dba7e05\": rpc error: code = NotFound desc = could not find container \"db71f055d416d4d6c3a3c2046b4b2bd204dffd6df77e310bdc4d4d0d9dba7e05\": container with ID starting with db71f055d416d4d6c3a3c2046b4b2bd204dffd6df77e310bdc4d4d0d9dba7e05 not found: ID does not exist" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 
10:43:25.223896 4765 scope.go:117] "RemoveContainer" containerID="c25af2fc24add53e6b2bf0a6e5ecc96f539fac4f29346303ea42f3afd37f9d2c" Mar 19 10:43:25 crc kubenswrapper[4765]: E0319 10:43:25.224263 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c25af2fc24add53e6b2bf0a6e5ecc96f539fac4f29346303ea42f3afd37f9d2c\": container with ID starting with c25af2fc24add53e6b2bf0a6e5ecc96f539fac4f29346303ea42f3afd37f9d2c not found: ID does not exist" containerID="c25af2fc24add53e6b2bf0a6e5ecc96f539fac4f29346303ea42f3afd37f9d2c" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.224307 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c25af2fc24add53e6b2bf0a6e5ecc96f539fac4f29346303ea42f3afd37f9d2c"} err="failed to get container status \"c25af2fc24add53e6b2bf0a6e5ecc96f539fac4f29346303ea42f3afd37f9d2c\": rpc error: code = NotFound desc = could not find container \"c25af2fc24add53e6b2bf0a6e5ecc96f539fac4f29346303ea42f3afd37f9d2c\": container with ID starting with c25af2fc24add53e6b2bf0a6e5ecc96f539fac4f29346303ea42f3afd37f9d2c not found: ID does not exist" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.224335 4765 scope.go:117] "RemoveContainer" containerID="4f334742b0e1a014b834b4129a622ca5ef19d7ed5b3555050a9039cf73ec1210" Mar 19 10:43:25 crc kubenswrapper[4765]: E0319 10:43:25.224784 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f334742b0e1a014b834b4129a622ca5ef19d7ed5b3555050a9039cf73ec1210\": container with ID starting with 4f334742b0e1a014b834b4129a622ca5ef19d7ed5b3555050a9039cf73ec1210 not found: ID does not exist" containerID="4f334742b0e1a014b834b4129a622ca5ef19d7ed5b3555050a9039cf73ec1210" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.224866 4765 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4f334742b0e1a014b834b4129a622ca5ef19d7ed5b3555050a9039cf73ec1210"} err="failed to get container status \"4f334742b0e1a014b834b4129a622ca5ef19d7ed5b3555050a9039cf73ec1210\": rpc error: code = NotFound desc = could not find container \"4f334742b0e1a014b834b4129a622ca5ef19d7ed5b3555050a9039cf73ec1210\": container with ID starting with 4f334742b0e1a014b834b4129a622ca5ef19d7ed5b3555050a9039cf73ec1210 not found: ID does not exist" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.224894 4765 scope.go:117] "RemoveContainer" containerID="b1e3752be6605f29ac474e229120c00f527829b9f8aa536add5a101957307f0e" Mar 19 10:43:25 crc kubenswrapper[4765]: E0319 10:43:25.225296 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1e3752be6605f29ac474e229120c00f527829b9f8aa536add5a101957307f0e\": container with ID starting with b1e3752be6605f29ac474e229120c00f527829b9f8aa536add5a101957307f0e not found: ID does not exist" containerID="b1e3752be6605f29ac474e229120c00f527829b9f8aa536add5a101957307f0e" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.225340 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1e3752be6605f29ac474e229120c00f527829b9f8aa536add5a101957307f0e"} err="failed to get container status \"b1e3752be6605f29ac474e229120c00f527829b9f8aa536add5a101957307f0e\": rpc error: code = NotFound desc = could not find container \"b1e3752be6605f29ac474e229120c00f527829b9f8aa536add5a101957307f0e\": container with ID starting with b1e3752be6605f29ac474e229120c00f527829b9f8aa536add5a101957307f0e not found: ID does not exist" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.264315 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tnjqj\" (UniqueName: \"kubernetes.io/projected/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-kube-api-access-tnjqj\") pod 
\"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.264664 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-run-httpd\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.264819 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.264938 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-scripts\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.265142 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-log-httpd\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.265291 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-config-data\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.265399 
4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.367876 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-log-httpd\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.368001 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-config-data\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.368055 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.368170 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tnjqj\" (UniqueName: \"kubernetes.io/projected/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-kube-api-access-tnjqj\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.368205 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-run-httpd\") pod 
\"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.368269 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.368324 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-scripts\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.368687 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-log-httpd\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.369137 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-run-httpd\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.373188 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.373748 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-scripts\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.377275 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-config-data\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.377891 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.391773 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tnjqj\" (UniqueName: \"kubernetes.io/projected/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-kube-api-access-tnjqj\") pod \"ceilometer-0\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.443644 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:43:25 crc kubenswrapper[4765]: I0319 10:43:25.920540 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:43:26 crc kubenswrapper[4765]: I0319 10:43:26.066471 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e","Type":"ContainerStarted","Data":"cb325346cdc1417ef37ac90ce95f3a5ab8088a09a05964b5b8c0269c2c01a398"} Mar 19 10:43:26 crc kubenswrapper[4765]: I0319 10:43:26.368235 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc731af7-c5a0-4d4e-9f33-9deec0f322ee" path="/var/lib/kubelet/pods/fc731af7-c5a0-4d4e-9f33-9deec0f322ee/volumes" Mar 19 10:43:27 crc kubenswrapper[4765]: I0319 10:43:27.077919 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e","Type":"ContainerStarted","Data":"35017f4b28567434d5a7365d0acc8138f4fe28cd23ffc3b505a53cf519cf71f9"} Mar 19 10:43:27 crc kubenswrapper[4765]: I0319 10:43:27.615933 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:43:28 crc kubenswrapper[4765]: I0319 10:43:28.088261 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e","Type":"ContainerStarted","Data":"de7c5f366d67ff6daf17d6353f3ea686deb394926d0e0300eca17c370e527207"} Mar 19 10:43:28 crc kubenswrapper[4765]: I0319 10:43:28.438309 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 10:43:28 crc kubenswrapper[4765]: I0319 10:43:28.539450 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:43:28 crc kubenswrapper[4765]: I0319 10:43:28.607109 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/dnsmasq-dns-56df8fb6b7-nvlgk"] Mar 19 10:43:28 crc kubenswrapper[4765]: I0319 10:43:28.607371 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" podUID="cb60c064-bde6-44f0-bc52-0da1205a7561" containerName="dnsmasq-dns" containerID="cri-o://85fece65b39d9fceec6e98c184220bd7dff7c03217c3ee3d8fc7ee66dd43aa4e" gracePeriod=10 Mar 19 10:43:28 crc kubenswrapper[4765]: I0319 10:43:28.851441 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.111294 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e","Type":"ContainerStarted","Data":"cad56476ad5b846839baa9d8083c1e920e626f18eed117f98b7234f4d1eb4095"} Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.138923 4765 generic.go:334] "Generic (PLEG): container finished" podID="cb60c064-bde6-44f0-bc52-0da1205a7561" containerID="85fece65b39d9fceec6e98c184220bd7dff7c03217c3ee3d8fc7ee66dd43aa4e" exitCode=0 Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.140208 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" event={"ID":"cb60c064-bde6-44f0-bc52-0da1205a7561","Type":"ContainerDied","Data":"85fece65b39d9fceec6e98c184220bd7dff7c03217c3ee3d8fc7ee66dd43aa4e"} Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.208201 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.337648 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.454856 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-ovsdbserver-nb\") pod \"cb60c064-bde6-44f0-bc52-0da1205a7561\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.455163 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-config\") pod \"cb60c064-bde6-44f0-bc52-0da1205a7561\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.455296 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-dns-swift-storage-0\") pod \"cb60c064-bde6-44f0-bc52-0da1205a7561\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.455510 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-dns-svc\") pod \"cb60c064-bde6-44f0-bc52-0da1205a7561\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.455608 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-ovsdbserver-sb\") pod \"cb60c064-bde6-44f0-bc52-0da1205a7561\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.455680 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l2xfd\" 
(UniqueName: \"kubernetes.io/projected/cb60c064-bde6-44f0-bc52-0da1205a7561-kube-api-access-l2xfd\") pod \"cb60c064-bde6-44f0-bc52-0da1205a7561\" (UID: \"cb60c064-bde6-44f0-bc52-0da1205a7561\") " Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.468156 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cb60c064-bde6-44f0-bc52-0da1205a7561-kube-api-access-l2xfd" (OuterVolumeSpecName: "kube-api-access-l2xfd") pod "cb60c064-bde6-44f0-bc52-0da1205a7561" (UID: "cb60c064-bde6-44f0-bc52-0da1205a7561"). InnerVolumeSpecName "kube-api-access-l2xfd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.509578 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "cb60c064-bde6-44f0-bc52-0da1205a7561" (UID: "cb60c064-bde6-44f0-bc52-0da1205a7561"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.542940 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "cb60c064-bde6-44f0-bc52-0da1205a7561" (UID: "cb60c064-bde6-44f0-bc52-0da1205a7561"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.558158 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.558197 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l2xfd\" (UniqueName: \"kubernetes.io/projected/cb60c064-bde6-44f0-bc52-0da1205a7561-kube-api-access-l2xfd\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.558238 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.567524 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "cb60c064-bde6-44f0-bc52-0da1205a7561" (UID: "cb60c064-bde6-44f0-bc52-0da1205a7561"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.573674 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-config" (OuterVolumeSpecName: "config") pod "cb60c064-bde6-44f0-bc52-0da1205a7561" (UID: "cb60c064-bde6-44f0-bc52-0da1205a7561"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.593660 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "cb60c064-bde6-44f0-bc52-0da1205a7561" (UID: "cb60c064-bde6-44f0-bc52-0da1205a7561"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.659951 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.660010 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.660026 4765 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/cb60c064-bde6-44f0-bc52-0da1205a7561-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.789815 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:43:29 crc kubenswrapper[4765]: I0319 10:43:29.900672 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:43:30 crc kubenswrapper[4765]: I0319 10:43:30.083678 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/horizon-6c6ff5646d-fmdz2" Mar 19 10:43:30 crc kubenswrapper[4765]: I0319 10:43:30.150111 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" Mar 19 10:43:30 crc kubenswrapper[4765]: I0319 10:43:30.150259 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-56df8fb6b7-nvlgk" event={"ID":"cb60c064-bde6-44f0-bc52-0da1205a7561","Type":"ContainerDied","Data":"ab11f5eeff91ed252a3c4753f59e53435c14f7f52a70a3098fbd1036ff609d03"} Mar 19 10:43:30 crc kubenswrapper[4765]: I0319 10:43:30.150297 4765 scope.go:117] "RemoveContainer" containerID="85fece65b39d9fceec6e98c184220bd7dff7c03217c3ee3d8fc7ee66dd43aa4e" Mar 19 10:43:30 crc kubenswrapper[4765]: I0319 10:43:30.150476 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e45ebb9b-6922-446d-a35d-b659217c5ef9" containerName="cinder-scheduler" containerID="cri-o://e31d9a59a0246db0d2c4c7fa1c458e96d17a7772f13ed898afffd6e139750ceb" gracePeriod=30 Mar 19 10:43:30 crc kubenswrapper[4765]: I0319 10:43:30.150565 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/cinder-scheduler-0" podUID="e45ebb9b-6922-446d-a35d-b659217c5ef9" containerName="probe" containerID="cri-o://ce3894d5acdf42457cd08ecc4a82f25a59317a8f34cbe51394e3aaf3524c56d1" gracePeriod=30 Mar 19 10:43:30 crc kubenswrapper[4765]: I0319 10:43:30.159318 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c6bdcb6fb-89kxv"] Mar 19 10:43:30 crc kubenswrapper[4765]: I0319 10:43:30.159559 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c6bdcb6fb-89kxv" podUID="5112f66b-28fa-4500-b77b-351b8c3d0519" containerName="horizon-log" containerID="cri-o://9f412968986b7556b9d0cd9de4886ebe503ad2f5c2b7c5677168459667cc0902" gracePeriod=30 Mar 19 10:43:30 crc kubenswrapper[4765]: I0319 10:43:30.159697 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/horizon-5c6bdcb6fb-89kxv" podUID="5112f66b-28fa-4500-b77b-351b8c3d0519" 
containerName="horizon" containerID="cri-o://1a7ad5eca76b21850fa11fd220a31f1fb2463a805be4f0068170a81bfea7d086" gracePeriod=30 Mar 19 10:43:30 crc kubenswrapper[4765]: I0319 10:43:30.215039 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-nvlgk"] Mar 19 10:43:30 crc kubenswrapper[4765]: I0319 10:43:30.233643 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-56df8fb6b7-nvlgk"] Mar 19 10:43:30 crc kubenswrapper[4765]: I0319 10:43:30.310026 4765 scope.go:117] "RemoveContainer" containerID="1412cf4a5fbc04b01e4c26a9f36eb1e8fec5df7dee65c94e7086b0c7b5eecedd" Mar 19 10:43:30 crc kubenswrapper[4765]: I0319 10:43:30.375258 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb60c064-bde6-44f0-bc52-0da1205a7561" path="/var/lib/kubelet/pods/cb60c064-bde6-44f0-bc52-0da1205a7561/volumes" Mar 19 10:43:31 crc kubenswrapper[4765]: E0319 10:43:31.134613 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pode45ebb9b_6922_446d_a35d_b659217c5ef9.slice/crio-conmon-ce3894d5acdf42457cd08ecc4a82f25a59317a8f34cbe51394e3aaf3524c56d1.scope\": RecentStats: unable to find data in memory cache]" Mar 19 10:43:31 crc kubenswrapper[4765]: I0319 10:43:31.163985 4765 generic.go:334] "Generic (PLEG): container finished" podID="e45ebb9b-6922-446d-a35d-b659217c5ef9" containerID="ce3894d5acdf42457cd08ecc4a82f25a59317a8f34cbe51394e3aaf3524c56d1" exitCode=0 Mar 19 10:43:31 crc kubenswrapper[4765]: I0319 10:43:31.164048 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e45ebb9b-6922-446d-a35d-b659217c5ef9","Type":"ContainerDied","Data":"ce3894d5acdf42457cd08ecc4a82f25a59317a8f34cbe51394e3aaf3524c56d1"} Mar 19 10:43:31 crc kubenswrapper[4765]: I0319 10:43:31.623809 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:31 crc kubenswrapper[4765]: I0319 10:43:31.661736 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/barbican-api-77865d778-4kfkp" Mar 19 10:43:31 crc kubenswrapper[4765]: I0319 10:43:31.735290 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-74c69ffcbd-fntb4"] Mar 19 10:43:31 crc kubenswrapper[4765]: I0319 10:43:31.735544 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-74c69ffcbd-fntb4" podUID="bcf43de1-bffe-4810-bfb0-c6ff2c59020a" containerName="barbican-api-log" containerID="cri-o://70759ac3af0142b361b7b7eb3782cae1528baa4e99b09eb8e979837547f4d818" gracePeriod=30 Mar 19 10:43:31 crc kubenswrapper[4765]: I0319 10:43:31.736067 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/barbican-api-74c69ffcbd-fntb4" podUID="bcf43de1-bffe-4810-bfb0-c6ff2c59020a" containerName="barbican-api" containerID="cri-o://702e682067397a234f29ad5a259464a4313162edb94817bf35554432126fcc98" gracePeriod=30 Mar 19 10:43:31 crc kubenswrapper[4765]: I0319 10:43:31.968563 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/keystone-79688b6ffc-lc92w" Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.208519 4765 generic.go:334] "Generic (PLEG): container finished" podID="e45ebb9b-6922-446d-a35d-b659217c5ef9" containerID="e31d9a59a0246db0d2c4c7fa1c458e96d17a7772f13ed898afffd6e139750ceb" exitCode=0 Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.208807 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e45ebb9b-6922-446d-a35d-b659217c5ef9","Type":"ContainerDied","Data":"e31d9a59a0246db0d2c4c7fa1c458e96d17a7772f13ed898afffd6e139750ceb"} Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.214407 4765 generic.go:334] "Generic (PLEG): container finished" 
podID="bcf43de1-bffe-4810-bfb0-c6ff2c59020a" containerID="70759ac3af0142b361b7b7eb3782cae1528baa4e99b09eb8e979837547f4d818" exitCode=143 Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.214616 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74c69ffcbd-fntb4" event={"ID":"bcf43de1-bffe-4810-bfb0-c6ff2c59020a","Type":"ContainerDied","Data":"70759ac3af0142b361b7b7eb3782cae1528baa4e99b09eb8e979837547f4d818"} Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.705129 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.823217 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lb77z\" (UniqueName: \"kubernetes.io/projected/e45ebb9b-6922-446d-a35d-b659217c5ef9-kube-api-access-lb77z\") pod \"e45ebb9b-6922-446d-a35d-b659217c5ef9\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.823365 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/e45ebb9b-6922-446d-a35d-b659217c5ef9-etc-machine-id\") pod \"e45ebb9b-6922-446d-a35d-b659217c5ef9\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.823400 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-scripts\") pod \"e45ebb9b-6922-446d-a35d-b659217c5ef9\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.823498 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-config-data-custom\") pod \"e45ebb9b-6922-446d-a35d-b659217c5ef9\" 
(UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.823564 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-combined-ca-bundle\") pod \"e45ebb9b-6922-446d-a35d-b659217c5ef9\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.823630 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-config-data\") pod \"e45ebb9b-6922-446d-a35d-b659217c5ef9\" (UID: \"e45ebb9b-6922-446d-a35d-b659217c5ef9\") " Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.829335 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e45ebb9b-6922-446d-a35d-b659217c5ef9-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "e45ebb9b-6922-446d-a35d-b659217c5ef9" (UID: "e45ebb9b-6922-446d-a35d-b659217c5ef9"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.833936 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e45ebb9b-6922-446d-a35d-b659217c5ef9-kube-api-access-lb77z" (OuterVolumeSpecName: "kube-api-access-lb77z") pod "e45ebb9b-6922-446d-a35d-b659217c5ef9" (UID: "e45ebb9b-6922-446d-a35d-b659217c5ef9"). InnerVolumeSpecName "kube-api-access-lb77z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.834094 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-scripts" (OuterVolumeSpecName: "scripts") pod "e45ebb9b-6922-446d-a35d-b659217c5ef9" (UID: "e45ebb9b-6922-446d-a35d-b659217c5ef9"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.839242 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "e45ebb9b-6922-446d-a35d-b659217c5ef9" (UID: "e45ebb9b-6922-446d-a35d-b659217c5ef9"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.896087 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "e45ebb9b-6922-446d-a35d-b659217c5ef9" (UID: "e45ebb9b-6922-446d-a35d-b659217c5ef9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.926106 4765 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.926146 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.926158 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lb77z\" (UniqueName: \"kubernetes.io/projected/e45ebb9b-6922-446d-a35d-b659217c5ef9-kube-api-access-lb77z\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.926172 4765 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: 
\"kubernetes.io/host-path/e45ebb9b-6922-446d-a35d-b659217c5ef9-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.926183 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:32 crc kubenswrapper[4765]: I0319 10:43:32.951163 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-config-data" (OuterVolumeSpecName: "config-data") pod "e45ebb9b-6922-446d-a35d-b659217c5ef9" (UID: "e45ebb9b-6922-446d-a35d-b659217c5ef9"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.020525 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.027793 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e45ebb9b-6922-446d-a35d-b659217c5ef9-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.105767 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/placement-54bc4cb6bd-w8bvw" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.174706 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-78fbf58cd4-88cvd"] Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.175247 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-78fbf58cd4-88cvd" podUID="83bf4704-916f-4b97-804c-c64d00158bc5" containerName="placement-log" containerID="cri-o://6c2171aeedc98b01a345cc71afeb8471981c231139f7a7849d287fef6f6b3565" gracePeriod=30 Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.175885 4765 
kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/placement-78fbf58cd4-88cvd" podUID="83bf4704-916f-4b97-804c-c64d00158bc5" containerName="placement-api" containerID="cri-o://009916405b79f186f13e7105fc31b9e99265153221baab7a5bbf67e6ce9b6cd6" gracePeriod=30 Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.227125 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e","Type":"ContainerStarted","Data":"7c87532bb51573b7b5457433c5fdf1d8df5eab8acce3a55904cfafe7b1b8e55f"} Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.228714 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.231350 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.231428 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"e45ebb9b-6922-446d-a35d-b659217c5ef9","Type":"ContainerDied","Data":"e46892dc8e6a42351183ba5a1f7f97989e6b99f77ddb96d220f23d87c8b3e6f5"} Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.231454 4765 scope.go:117] "RemoveContainer" containerID="ce3894d5acdf42457cd08ecc4a82f25a59317a8f34cbe51394e3aaf3524c56d1" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.287201 4765 scope.go:117] "RemoveContainer" containerID="e31d9a59a0246db0d2c4c7fa1c458e96d17a7772f13ed898afffd6e139750ceb" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.314828 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.2565888 podStartE2EDuration="8.314803176s" podCreationTimestamp="2026-03-19 10:43:25 +0000 UTC" firstStartedPulling="2026-03-19 10:43:25.933037639 +0000 UTC m=+1304.281983181" lastFinishedPulling="2026-03-19 
10:43:31.991252015 +0000 UTC m=+1310.340197557" observedRunningTime="2026-03-19 10:43:33.264855725 +0000 UTC m=+1311.613801267" watchObservedRunningTime="2026-03-19 10:43:33.314803176 +0000 UTC m=+1311.663748708" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.377501 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.438582 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.458133 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 10:43:33 crc kubenswrapper[4765]: E0319 10:43:33.458516 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb60c064-bde6-44f0-bc52-0da1205a7561" containerName="dnsmasq-dns" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.458529 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb60c064-bde6-44f0-bc52-0da1205a7561" containerName="dnsmasq-dns" Mar 19 10:43:33 crc kubenswrapper[4765]: E0319 10:43:33.458560 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e45ebb9b-6922-446d-a35d-b659217c5ef9" containerName="probe" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.458566 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45ebb9b-6922-446d-a35d-b659217c5ef9" containerName="probe" Mar 19 10:43:33 crc kubenswrapper[4765]: E0319 10:43:33.458582 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cb60c064-bde6-44f0-bc52-0da1205a7561" containerName="init" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.458588 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="cb60c064-bde6-44f0-bc52-0da1205a7561" containerName="init" Mar 19 10:43:33 crc kubenswrapper[4765]: E0319 10:43:33.458600 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e45ebb9b-6922-446d-a35d-b659217c5ef9" 
containerName="cinder-scheduler" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.458606 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e45ebb9b-6922-446d-a35d-b659217c5ef9" containerName="cinder-scheduler" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.458785 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e45ebb9b-6922-446d-a35d-b659217c5ef9" containerName="probe" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.458800 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e45ebb9b-6922-446d-a35d-b659217c5ef9" containerName="cinder-scheduler" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.458809 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="cb60c064-bde6-44f0-bc52-0da1205a7561" containerName="dnsmasq-dns" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.459801 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.464599 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-scheduler-config-data" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.500420 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.552314 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da64a060-18bb-4b34-9374-1fec5ad88ede-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.552643 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da64a060-18bb-4b34-9374-1fec5ad88ede-etc-machine-id\") pod 
\"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.552780 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da64a060-18bb-4b34-9374-1fec5ad88ede-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.552830 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da64a060-18bb-4b34-9374-1fec5ad88ede-config-data\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.552877 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da64a060-18bb-4b34-9374-1fec5ad88ede-scripts\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.553011 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h24cd\" (UniqueName: \"kubernetes.io/projected/da64a060-18bb-4b34-9374-1fec5ad88ede-kube-api-access-h24cd\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.654419 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da64a060-18bb-4b34-9374-1fec5ad88ede-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " 
pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.654466 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da64a060-18bb-4b34-9374-1fec5ad88ede-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.654534 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da64a060-18bb-4b34-9374-1fec5ad88ede-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.654553 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da64a060-18bb-4b34-9374-1fec5ad88ede-config-data\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.654595 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da64a060-18bb-4b34-9374-1fec5ad88ede-scripts\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.654642 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h24cd\" (UniqueName: \"kubernetes.io/projected/da64a060-18bb-4b34-9374-1fec5ad88ede-kube-api-access-h24cd\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.654937 4765 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/da64a060-18bb-4b34-9374-1fec5ad88ede-etc-machine-id\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.665686 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/da64a060-18bb-4b34-9374-1fec5ad88ede-combined-ca-bundle\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.666110 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/da64a060-18bb-4b34-9374-1fec5ad88ede-config-data\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.669775 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/da64a060-18bb-4b34-9374-1fec5ad88ede-scripts\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.678605 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/da64a060-18bb-4b34-9374-1fec5ad88ede-config-data-custom\") pod \"cinder-scheduler-0\" (UID: \"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.679161 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h24cd\" (UniqueName: \"kubernetes.io/projected/da64a060-18bb-4b34-9374-1fec5ad88ede-kube-api-access-h24cd\") pod \"cinder-scheduler-0\" (UID: 
\"da64a060-18bb-4b34-9374-1fec5ad88ede\") " pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.793450 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-scheduler-0" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.827350 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/cinder-api-0" podUID="f567275e-0c40-4ef2-8c5f-fb40aad223f8" containerName="cinder-api" probeResult="failure" output="Get \"http://10.217.0.170:8776/healthcheck\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.859207 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/openstackclient"] Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.860383 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.870665 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/openstackclient"] Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.875738 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-config-secret" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.875985 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-config" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.876141 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstackclient-openstackclient-dockercfg-wfvp8" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.962002 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1cda8252-a988-49d1-a566-8d9989b86034-openstack-config\") pod \"openstackclient\" (UID: \"1cda8252-a988-49d1-a566-8d9989b86034\") " 
pod="openstack/openstackclient" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.962124 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cda8252-a988-49d1-a566-8d9989b86034-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1cda8252-a988-49d1-a566-8d9989b86034\") " pod="openstack/openstackclient" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.962172 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x7x2\" (UniqueName: \"kubernetes.io/projected/1cda8252-a988-49d1-a566-8d9989b86034-kube-api-access-8x7x2\") pod \"openstackclient\" (UID: \"1cda8252-a988-49d1-a566-8d9989b86034\") " pod="openstack/openstackclient" Mar 19 10:43:33 crc kubenswrapper[4765]: I0319 10:43:33.962211 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1cda8252-a988-49d1-a566-8d9989b86034-openstack-config-secret\") pod \"openstackclient\" (UID: \"1cda8252-a988-49d1-a566-8d9989b86034\") " pod="openstack/openstackclient" Mar 19 10:43:34 crc kubenswrapper[4765]: I0319 10:43:34.064337 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cda8252-a988-49d1-a566-8d9989b86034-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1cda8252-a988-49d1-a566-8d9989b86034\") " pod="openstack/openstackclient" Mar 19 10:43:34 crc kubenswrapper[4765]: I0319 10:43:34.064412 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8x7x2\" (UniqueName: \"kubernetes.io/projected/1cda8252-a988-49d1-a566-8d9989b86034-kube-api-access-8x7x2\") pod \"openstackclient\" (UID: \"1cda8252-a988-49d1-a566-8d9989b86034\") " pod="openstack/openstackclient" Mar 19 10:43:34 crc kubenswrapper[4765]: 
I0319 10:43:34.064452 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/1cda8252-a988-49d1-a566-8d9989b86034-openstack-config-secret\") pod \"openstackclient\" (UID: \"1cda8252-a988-49d1-a566-8d9989b86034\") " pod="openstack/openstackclient" Mar 19 10:43:34 crc kubenswrapper[4765]: I0319 10:43:34.064563 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1cda8252-a988-49d1-a566-8d9989b86034-openstack-config\") pod \"openstackclient\" (UID: \"1cda8252-a988-49d1-a566-8d9989b86034\") " pod="openstack/openstackclient" Mar 19 10:43:34 crc kubenswrapper[4765]: I0319 10:43:34.070433 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/1cda8252-a988-49d1-a566-8d9989b86034-openstack-config\") pod \"openstackclient\" (UID: \"1cda8252-a988-49d1-a566-8d9989b86034\") " pod="openstack/openstackclient" Mar 19 10:43:34 crc kubenswrapper[4765]: I0319 10:43:34.111742 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x7x2\" (UniqueName: \"kubernetes.io/projected/1cda8252-a988-49d1-a566-8d9989b86034-kube-api-access-8x7x2\") pod \"openstackclient\" (UID: \"1cda8252-a988-49d1-a566-8d9989b86034\") " pod="openstack/openstackclient" Mar 19 10:43:34 crc kubenswrapper[4765]: I0319 10:43:34.115729 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1cda8252-a988-49d1-a566-8d9989b86034-combined-ca-bundle\") pod \"openstackclient\" (UID: \"1cda8252-a988-49d1-a566-8d9989b86034\") " pod="openstack/openstackclient" Mar 19 10:43:34 crc kubenswrapper[4765]: I0319 10:43:34.128411 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: 
\"kubernetes.io/secret/1cda8252-a988-49d1-a566-8d9989b86034-openstack-config-secret\") pod \"openstackclient\" (UID: \"1cda8252-a988-49d1-a566-8d9989b86034\") " pod="openstack/openstackclient" Mar 19 10:43:34 crc kubenswrapper[4765]: I0319 10:43:34.196879 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/openstackclient" Mar 19 10:43:34 crc kubenswrapper[4765]: I0319 10:43:34.279877 4765 generic.go:334] "Generic (PLEG): container finished" podID="83bf4704-916f-4b97-804c-c64d00158bc5" containerID="6c2171aeedc98b01a345cc71afeb8471981c231139f7a7849d287fef6f6b3565" exitCode=143 Mar 19 10:43:34 crc kubenswrapper[4765]: I0319 10:43:34.279937 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78fbf58cd4-88cvd" event={"ID":"83bf4704-916f-4b97-804c-c64d00158bc5","Type":"ContainerDied","Data":"6c2171aeedc98b01a345cc71afeb8471981c231139f7a7849d287fef6f6b3565"} Mar 19 10:43:34 crc kubenswrapper[4765]: I0319 10:43:34.285002 4765 generic.go:334] "Generic (PLEG): container finished" podID="5112f66b-28fa-4500-b77b-351b8c3d0519" containerID="1a7ad5eca76b21850fa11fd220a31f1fb2463a805be4f0068170a81bfea7d086" exitCode=0 Mar 19 10:43:34 crc kubenswrapper[4765]: I0319 10:43:34.285084 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c6bdcb6fb-89kxv" event={"ID":"5112f66b-28fa-4500-b77b-351b8c3d0519","Type":"ContainerDied","Data":"1a7ad5eca76b21850fa11fd220a31f1fb2463a805be4f0068170a81bfea7d086"} Mar 19 10:43:34 crc kubenswrapper[4765]: I0319 10:43:34.380983 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e45ebb9b-6922-446d-a35d-b659217c5ef9" path="/var/lib/kubelet/pods/e45ebb9b-6922-446d-a35d-b659217c5ef9/volumes" Mar 19 10:43:34 crc kubenswrapper[4765]: I0319 10:43:34.422943 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-scheduler-0"] Mar 19 10:43:34 crc kubenswrapper[4765]: I0319 10:43:34.723602 4765 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openstack/openstackclient"] Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.071833 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5c6bdcb6fb-89kxv" podUID="5112f66b-28fa-4500-b77b-351b8c3d0519" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.178723 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-74c69ffcbd-fntb4" podUID="bcf43de1-bffe-4810-bfb0-c6ff2c59020a" containerName="barbican-api-log" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:42560->10.217.0.163:9311: read: connection reset by peer" Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.180468 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/barbican-api-74c69ffcbd-fntb4" podUID="bcf43de1-bffe-4810-bfb0-c6ff2c59020a" containerName="barbican-api" probeResult="failure" output="Get \"http://10.217.0.163:9311/healthcheck\": read tcp 10.217.0.2:42562->10.217.0.163:9311: read: connection reset by peer" Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.319597 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1cda8252-a988-49d1-a566-8d9989b86034","Type":"ContainerStarted","Data":"148060afa933e9164d7174971e24e87e13878478a697ed62ad3ffc53968962ab"} Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.322275 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da64a060-18bb-4b34-9374-1fec5ad88ede","Type":"ContainerStarted","Data":"1f7f12b4da1a4892f4b20866494ff5c27a0c2f888412ec49b8b39213a4b8dae4"} Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.322323 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/cinder-scheduler-0" event={"ID":"da64a060-18bb-4b34-9374-1fec5ad88ede","Type":"ContainerStarted","Data":"345e0181b3b5e2e0e3a3de3419aefe4b3a23d57c60e6aef5a0e307a9294971dc"} Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.326525 4765 generic.go:334] "Generic (PLEG): container finished" podID="bcf43de1-bffe-4810-bfb0-c6ff2c59020a" containerID="702e682067397a234f29ad5a259464a4313162edb94817bf35554432126fcc98" exitCode=0 Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.326741 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74c69ffcbd-fntb4" event={"ID":"bcf43de1-bffe-4810-bfb0-c6ff2c59020a","Type":"ContainerDied","Data":"702e682067397a234f29ad5a259464a4313162edb94817bf35554432126fcc98"} Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.748667 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.808734 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkg5z\" (UniqueName: \"kubernetes.io/projected/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-kube-api-access-dkg5z\") pod \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.808798 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-config-data-custom\") pod \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.808830 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-combined-ca-bundle\") pod \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\" (UID: 
\"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.809234 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-logs\") pod \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.809330 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-config-data\") pod \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\" (UID: \"bcf43de1-bffe-4810-bfb0-c6ff2c59020a\") " Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.809704 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-logs" (OuterVolumeSpecName: "logs") pod "bcf43de1-bffe-4810-bfb0-c6ff2c59020a" (UID: "bcf43de1-bffe-4810-bfb0-c6ff2c59020a"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.810212 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.819022 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-kube-api-access-dkg5z" (OuterVolumeSpecName: "kube-api-access-dkg5z") pod "bcf43de1-bffe-4810-bfb0-c6ff2c59020a" (UID: "bcf43de1-bffe-4810-bfb0-c6ff2c59020a"). InnerVolumeSpecName "kube-api-access-dkg5z". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.821685 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "bcf43de1-bffe-4810-bfb0-c6ff2c59020a" (UID: "bcf43de1-bffe-4810-bfb0-c6ff2c59020a"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.850228 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bcf43de1-bffe-4810-bfb0-c6ff2c59020a" (UID: "bcf43de1-bffe-4810-bfb0-c6ff2c59020a"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.887778 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-config-data" (OuterVolumeSpecName: "config-data") pod "bcf43de1-bffe-4810-bfb0-c6ff2c59020a" (UID: "bcf43de1-bffe-4810-bfb0-c6ff2c59020a"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.911885 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkg5z\" (UniqueName: \"kubernetes.io/projected/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-kube-api-access-dkg5z\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.911925 4765 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.911934 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:35 crc kubenswrapper[4765]: I0319 10:43:35.911944 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bcf43de1-bffe-4810-bfb0-c6ff2c59020a-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:36 crc kubenswrapper[4765]: I0319 10:43:36.346594 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-scheduler-0" event={"ID":"da64a060-18bb-4b34-9374-1fec5ad88ede","Type":"ContainerStarted","Data":"e5a97053cddf41013d5dd60180eb178f0519256c58fd122d63abfd12a02adae0"} Mar 19 10:43:36 crc kubenswrapper[4765]: I0319 10:43:36.354780 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/barbican-api-74c69ffcbd-fntb4" event={"ID":"bcf43de1-bffe-4810-bfb0-c6ff2c59020a","Type":"ContainerDied","Data":"f9fb5e8cf7da43f95d51a64857313918e6f0174334346da149cffff79fec459d"} Mar 19 10:43:36 crc kubenswrapper[4765]: I0319 10:43:36.354826 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/barbican-api-74c69ffcbd-fntb4" Mar 19 10:43:36 crc kubenswrapper[4765]: I0319 10:43:36.354849 4765 scope.go:117] "RemoveContainer" containerID="702e682067397a234f29ad5a259464a4313162edb94817bf35554432126fcc98" Mar 19 10:43:36 crc kubenswrapper[4765]: I0319 10:43:36.371904 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-scheduler-0" podStartSLOduration=3.371881298 podStartE2EDuration="3.371881298s" podCreationTimestamp="2026-03-19 10:43:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:36.37054097 +0000 UTC m=+1314.719486532" watchObservedRunningTime="2026-03-19 10:43:36.371881298 +0000 UTC m=+1314.720826830" Mar 19 10:43:36 crc kubenswrapper[4765]: I0319 10:43:36.425647 4765 scope.go:117] "RemoveContainer" containerID="70759ac3af0142b361b7b7eb3782cae1528baa4e99b09eb8e979837547f4d818" Mar 19 10:43:36 crc kubenswrapper[4765]: I0319 10:43:36.436538 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-api-74c69ffcbd-fntb4"] Mar 19 10:43:36 crc kubenswrapper[4765]: I0319 10:43:36.445475 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-api-74c69ffcbd-fntb4"] Mar 19 10:43:36 crc kubenswrapper[4765]: I0319 10:43:36.608181 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.366676 4765 generic.go:334] "Generic (PLEG): container finished" podID="83bf4704-916f-4b97-804c-c64d00158bc5" containerID="009916405b79f186f13e7105fc31b9e99265153221baab7a5bbf67e6ce9b6cd6" exitCode=0 Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.366760 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78fbf58cd4-88cvd" 
event={"ID":"83bf4704-916f-4b97-804c-c64d00158bc5","Type":"ContainerDied","Data":"009916405b79f186f13e7105fc31b9e99265153221baab7a5bbf67e6ce9b6cd6"} Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.736602 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.757924 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-combined-ca-bundle\") pod \"83bf4704-916f-4b97-804c-c64d00158bc5\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.758021 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-config-data\") pod \"83bf4704-916f-4b97-804c-c64d00158bc5\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.758163 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s285x\" (UniqueName: \"kubernetes.io/projected/83bf4704-916f-4b97-804c-c64d00158bc5-kube-api-access-s285x\") pod \"83bf4704-916f-4b97-804c-c64d00158bc5\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.758263 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-public-tls-certs\") pod \"83bf4704-916f-4b97-804c-c64d00158bc5\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.758335 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-scripts\") pod \"83bf4704-916f-4b97-804c-c64d00158bc5\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.758383 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83bf4704-916f-4b97-804c-c64d00158bc5-logs\") pod \"83bf4704-916f-4b97-804c-c64d00158bc5\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.758409 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-internal-tls-certs\") pod \"83bf4704-916f-4b97-804c-c64d00158bc5\" (UID: \"83bf4704-916f-4b97-804c-c64d00158bc5\") " Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.759686 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83bf4704-916f-4b97-804c-c64d00158bc5-logs" (OuterVolumeSpecName: "logs") pod "83bf4704-916f-4b97-804c-c64d00158bc5" (UID: "83bf4704-916f-4b97-804c-c64d00158bc5"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.760809 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83bf4704-916f-4b97-804c-c64d00158bc5-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.766120 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83bf4704-916f-4b97-804c-c64d00158bc5-kube-api-access-s285x" (OuterVolumeSpecName: "kube-api-access-s285x") pod "83bf4704-916f-4b97-804c-c64d00158bc5" (UID: "83bf4704-916f-4b97-804c-c64d00158bc5"). InnerVolumeSpecName "kube-api-access-s285x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.794475 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-scripts" (OuterVolumeSpecName: "scripts") pod "83bf4704-916f-4b97-804c-c64d00158bc5" (UID: "83bf4704-916f-4b97-804c-c64d00158bc5"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.854841 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-config-data" (OuterVolumeSpecName: "config-data") pod "83bf4704-916f-4b97-804c-c64d00158bc5" (UID: "83bf4704-916f-4b97-804c-c64d00158bc5"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.857485 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83bf4704-916f-4b97-804c-c64d00158bc5" (UID: "83bf4704-916f-4b97-804c-c64d00158bc5"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.862930 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.862989 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.863002 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s285x\" (UniqueName: \"kubernetes.io/projected/83bf4704-916f-4b97-804c-c64d00158bc5-kube-api-access-s285x\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.863028 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.923831 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "83bf4704-916f-4b97-804c-c64d00158bc5" (UID: "83bf4704-916f-4b97-804c-c64d00158bc5"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.940494 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "83bf4704-916f-4b97-804c-c64d00158bc5" (UID: "83bf4704-916f-4b97-804c-c64d00158bc5"). InnerVolumeSpecName "public-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.966085 4765 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:37 crc kubenswrapper[4765]: I0319 10:43:37.966122 4765 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/83bf4704-916f-4b97-804c-c64d00158bc5-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:38 crc kubenswrapper[4765]: I0319 10:43:38.369029 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bcf43de1-bffe-4810-bfb0-c6ff2c59020a" path="/var/lib/kubelet/pods/bcf43de1-bffe-4810-bfb0-c6ff2c59020a/volumes" Mar 19 10:43:38 crc kubenswrapper[4765]: I0319 10:43:38.389067 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/placement-78fbf58cd4-88cvd" event={"ID":"83bf4704-916f-4b97-804c-c64d00158bc5","Type":"ContainerDied","Data":"35d100727d6206ae2b837126fdec2f26519ef810c93fd624c96860bbaf688e7c"} Mar 19 10:43:38 crc kubenswrapper[4765]: I0319 10:43:38.389118 4765 scope.go:117] "RemoveContainer" containerID="009916405b79f186f13e7105fc31b9e99265153221baab7a5bbf67e6ce9b6cd6" Mar 19 10:43:38 crc kubenswrapper[4765]: I0319 10:43:38.389222 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/placement-78fbf58cd4-88cvd" Mar 19 10:43:38 crc kubenswrapper[4765]: I0319 10:43:38.414826 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-78fbf58cd4-88cvd"] Mar 19 10:43:38 crc kubenswrapper[4765]: I0319 10:43:38.425140 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-78fbf58cd4-88cvd"] Mar 19 10:43:38 crc kubenswrapper[4765]: I0319 10:43:38.434030 4765 scope.go:117] "RemoveContainer" containerID="6c2171aeedc98b01a345cc71afeb8471981c231139f7a7849d287fef6f6b3565" Mar 19 10:43:38 crc kubenswrapper[4765]: I0319 10:43:38.793838 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/cinder-scheduler-0" Mar 19 10:43:39 crc kubenswrapper[4765]: I0319 10:43:39.846238 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 10:43:39 crc kubenswrapper[4765]: I0319 10:43:39.846679 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a083bcfd-87a7-43f7-b0a3-1180bea648b3" containerName="glance-log" containerID="cri-o://a2c13d1a9d7fc306c80cb91654ba453ad5b1dc4a52e34f7c0e29a95aed9cdd2d" gracePeriod=30 Mar 19 10:43:39 crc kubenswrapper[4765]: I0319 10:43:39.847049 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-internal-api-0" podUID="a083bcfd-87a7-43f7-b0a3-1180bea648b3" containerName="glance-httpd" containerID="cri-o://21835277e1595cab353343771c2d49e7872ba78270008da1c8865836e3c549b7" gracePeriod=30 Mar 19 10:43:40 crc kubenswrapper[4765]: I0319 10:43:40.133833 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:40 crc kubenswrapper[4765]: I0319 10:43:40.369401 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83bf4704-916f-4b97-804c-c64d00158bc5" 
path="/var/lib/kubelet/pods/83bf4704-916f-4b97-804c-c64d00158bc5/volumes" Mar 19 10:43:40 crc kubenswrapper[4765]: I0319 10:43:40.446474 4765 generic.go:334] "Generic (PLEG): container finished" podID="a083bcfd-87a7-43f7-b0a3-1180bea648b3" containerID="a2c13d1a9d7fc306c80cb91654ba453ad5b1dc4a52e34f7c0e29a95aed9cdd2d" exitCode=143 Mar 19 10:43:40 crc kubenswrapper[4765]: I0319 10:43:40.446525 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a083bcfd-87a7-43f7-b0a3-1180bea648b3","Type":"ContainerDied","Data":"a2c13d1a9d7fc306c80cb91654ba453ad5b1dc4a52e34f7c0e29a95aed9cdd2d"} Mar 19 10:43:40 crc kubenswrapper[4765]: I0319 10:43:40.964920 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:43:40 crc kubenswrapper[4765]: I0319 10:43:40.965215 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerName="sg-core" containerID="cri-o://cad56476ad5b846839baa9d8083c1e920e626f18eed117f98b7234f4d1eb4095" gracePeriod=30 Mar 19 10:43:40 crc kubenswrapper[4765]: I0319 10:43:40.965274 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerName="ceilometer-notification-agent" containerID="cri-o://de7c5f366d67ff6daf17d6353f3ea686deb394926d0e0300eca17c370e527207" gracePeriod=30 Mar 19 10:43:40 crc kubenswrapper[4765]: I0319 10:43:40.965242 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerName="proxy-httpd" containerID="cri-o://7c87532bb51573b7b5457433c5fdf1d8df5eab8acce3a55904cfafe7b1b8e55f" gracePeriod=30 Mar 19 10:43:40 crc kubenswrapper[4765]: I0319 10:43:40.965368 4765 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openstack/ceilometer-0" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerName="ceilometer-central-agent" containerID="cri-o://35017f4b28567434d5a7365d0acc8138f4fe28cd23ffc3b505a53cf519cf71f9" gracePeriod=30 Mar 19 10:43:41 crc kubenswrapper[4765]: E0319 10:43:41.380904 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bfdc94f_3c5d_47de_8d4a_59d804b9b68e.slice/crio-35017f4b28567434d5a7365d0acc8138f4fe28cd23ffc3b505a53cf519cf71f9.scope\": RecentStats: unable to find data in memory cache]" Mar 19 10:43:41 crc kubenswrapper[4765]: I0319 10:43:41.457222 4765 generic.go:334] "Generic (PLEG): container finished" podID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerID="7c87532bb51573b7b5457433c5fdf1d8df5eab8acce3a55904cfafe7b1b8e55f" exitCode=0 Mar 19 10:43:41 crc kubenswrapper[4765]: I0319 10:43:41.457262 4765 generic.go:334] "Generic (PLEG): container finished" podID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerID="cad56476ad5b846839baa9d8083c1e920e626f18eed117f98b7234f4d1eb4095" exitCode=2 Mar 19 10:43:41 crc kubenswrapper[4765]: I0319 10:43:41.457273 4765 generic.go:334] "Generic (PLEG): container finished" podID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerID="35017f4b28567434d5a7365d0acc8138f4fe28cd23ffc3b505a53cf519cf71f9" exitCode=0 Mar 19 10:43:41 crc kubenswrapper[4765]: I0319 10:43:41.457293 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e","Type":"ContainerDied","Data":"7c87532bb51573b7b5457433c5fdf1d8df5eab8acce3a55904cfafe7b1b8e55f"} Mar 19 10:43:41 crc kubenswrapper[4765]: I0319 10:43:41.457336 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e","Type":"ContainerDied","Data":"cad56476ad5b846839baa9d8083c1e920e626f18eed117f98b7234f4d1eb4095"} Mar 19 
10:43:41 crc kubenswrapper[4765]: I0319 10:43:41.457346 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e","Type":"ContainerDied","Data":"35017f4b28567434d5a7365d0acc8138f4fe28cd23ffc3b505a53cf519cf71f9"} Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.082669 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/swift-proxy-556979b4dc-zj26d"] Mar 19 10:43:42 crc kubenswrapper[4765]: E0319 10:43:42.083040 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83bf4704-916f-4b97-804c-c64d00158bc5" containerName="placement-log" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.083052 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="83bf4704-916f-4b97-804c-c64d00158bc5" containerName="placement-log" Mar 19 10:43:42 crc kubenswrapper[4765]: E0319 10:43:42.083064 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf43de1-bffe-4810-bfb0-c6ff2c59020a" containerName="barbican-api" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.083070 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf43de1-bffe-4810-bfb0-c6ff2c59020a" containerName="barbican-api" Mar 19 10:43:42 crc kubenswrapper[4765]: E0319 10:43:42.083094 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bcf43de1-bffe-4810-bfb0-c6ff2c59020a" containerName="barbican-api-log" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.083101 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="bcf43de1-bffe-4810-bfb0-c6ff2c59020a" containerName="barbican-api-log" Mar 19 10:43:42 crc kubenswrapper[4765]: E0319 10:43:42.083112 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83bf4704-916f-4b97-804c-c64d00158bc5" containerName="placement-api" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.083118 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="83bf4704-916f-4b97-804c-c64d00158bc5" 
containerName="placement-api" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.083312 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf43de1-bffe-4810-bfb0-c6ff2c59020a" containerName="barbican-api" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.083322 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="83bf4704-916f-4b97-804c-c64d00158bc5" containerName="placement-log" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.083331 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="bcf43de1-bffe-4810-bfb0-c6ff2c59020a" containerName="barbican-api-log" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.083344 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="83bf4704-916f-4b97-804c-c64d00158bc5" containerName="placement-api" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.084225 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.086516 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-internal-svc" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.086703 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-swift-public-svc" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.089336 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"swift-proxy-config-data" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.105062 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-556979b4dc-zj26d"] Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.177452 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00e0de39-87cf-4a6e-8980-a294f329e430-etc-swift\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: 
\"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.177504 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00e0de39-87cf-4a6e-8980-a294f329e430-run-httpd\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.177538 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e0de39-87cf-4a6e-8980-a294f329e430-combined-ca-bundle\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.177568 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00e0de39-87cf-4a6e-8980-a294f329e430-log-httpd\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.177590 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e0de39-87cf-4a6e-8980-a294f329e430-public-tls-certs\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.177687 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e0de39-87cf-4a6e-8980-a294f329e430-internal-tls-certs\") pod 
\"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.177728 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e0de39-87cf-4a6e-8980-a294f329e430-config-data\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.177757 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n26jp\" (UniqueName: \"kubernetes.io/projected/00e0de39-87cf-4a6e-8980-a294f329e430-kube-api-access-n26jp\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.279854 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00e0de39-87cf-4a6e-8980-a294f329e430-etc-swift\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.279929 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00e0de39-87cf-4a6e-8980-a294f329e430-run-httpd\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.279981 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e0de39-87cf-4a6e-8980-a294f329e430-combined-ca-bundle\") pod 
\"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.280019 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00e0de39-87cf-4a6e-8980-a294f329e430-log-httpd\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.280050 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e0de39-87cf-4a6e-8980-a294f329e430-public-tls-certs\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.280149 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e0de39-87cf-4a6e-8980-a294f329e430-internal-tls-certs\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.280209 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e0de39-87cf-4a6e-8980-a294f329e430-config-data\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.280251 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n26jp\" (UniqueName: \"kubernetes.io/projected/00e0de39-87cf-4a6e-8980-a294f329e430-kube-api-access-n26jp\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: 
\"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.280494 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00e0de39-87cf-4a6e-8980-a294f329e430-run-httpd\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.280603 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/00e0de39-87cf-4a6e-8980-a294f329e430-log-httpd\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.287120 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/00e0de39-87cf-4a6e-8980-a294f329e430-combined-ca-bundle\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.287171 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/00e0de39-87cf-4a6e-8980-a294f329e430-config-data\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.287374 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/00e0de39-87cf-4a6e-8980-a294f329e430-etc-swift\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc 
kubenswrapper[4765]: I0319 10:43:42.287942 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e0de39-87cf-4a6e-8980-a294f329e430-internal-tls-certs\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.291289 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/00e0de39-87cf-4a6e-8980-a294f329e430-public-tls-certs\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.298856 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n26jp\" (UniqueName: \"kubernetes.io/projected/00e0de39-87cf-4a6e-8980-a294f329e430-kube-api-access-n26jp\") pod \"swift-proxy-556979b4dc-zj26d\" (UID: \"00e0de39-87cf-4a6e-8980-a294f329e430\") " pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.406185 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.481684 4765 generic.go:334] "Generic (PLEG): container finished" podID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerID="de7c5f366d67ff6daf17d6353f3ea686deb394926d0e0300eca17c370e527207" exitCode=0 Mar 19 10:43:42 crc kubenswrapper[4765]: I0319 10:43:42.482011 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e","Type":"ContainerDied","Data":"de7c5f366d67ff6daf17d6353f3ea686deb394926d0e0300eca17c370e527207"} Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.180177 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.180733 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8dd71512-2453-4dff-98d8-3cf981fbbb8f" containerName="glance-log" containerID="cri-o://b6ebe68d3a95bc401568b3ba8caa6bf11c12bccbcf18901317915931dd3c8093" gracePeriod=30 Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.181248 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/glance-default-external-api-0" podUID="8dd71512-2453-4dff-98d8-3cf981fbbb8f" containerName="glance-httpd" containerID="cri-o://ecd8b3390445b80ae72265c157bbdc070db369ac9b9f3af9d421ae1b82ddbdb2" gracePeriod=30 Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.487867 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/neutron-67b57ccc79-wx8k9" Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.501297 4765 generic.go:334] "Generic (PLEG): container finished" podID="a083bcfd-87a7-43f7-b0a3-1180bea648b3" containerID="21835277e1595cab353343771c2d49e7872ba78270008da1c8865836e3c549b7" exitCode=0 Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 
10:43:43.501373 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a083bcfd-87a7-43f7-b0a3-1180bea648b3","Type":"ContainerDied","Data":"21835277e1595cab353343771c2d49e7872ba78270008da1c8865836e3c549b7"} Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.504505 4765 generic.go:334] "Generic (PLEG): container finished" podID="8dd71512-2453-4dff-98d8-3cf981fbbb8f" containerID="b6ebe68d3a95bc401568b3ba8caa6bf11c12bccbcf18901317915931dd3c8093" exitCode=143 Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.504548 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8dd71512-2453-4dff-98d8-3cf981fbbb8f","Type":"ContainerDied","Data":"b6ebe68d3a95bc401568b3ba8caa6bf11c12bccbcf18901317915931dd3c8093"} Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.552493 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66fb7cb9f6-g7xpk"] Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.552750 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66fb7cb9f6-g7xpk" podUID="ab7915d2-c641-481f-a9f6-1ce1209c7e17" containerName="neutron-api" containerID="cri-o://402155193e5eb4df46bcd8d42397b51050e1b818f7e03209b35eab08249dd7aa" gracePeriod=30 Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.554320 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/neutron-66fb7cb9f6-g7xpk" podUID="ab7915d2-c641-481f-a9f6-1ce1209c7e17" containerName="neutron-httpd" containerID="cri-o://49b4eaf7fb3307781adb143ca4b6178181d60e6e330bab6ca84d5ec1f928af9b" gracePeriod=30 Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.770576 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-db-create-bg8xf"] Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.771926 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-db-create-bg8xf" Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.782799 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-bg8xf"] Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.830276 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjvj6\" (UniqueName: \"kubernetes.io/projected/d0821b55-a4e1-4b0f-af18-513aefaa8d9e-kube-api-access-bjvj6\") pod \"nova-api-db-create-bg8xf\" (UID: \"d0821b55-a4e1-4b0f-af18-513aefaa8d9e\") " pod="openstack/nova-api-db-create-bg8xf" Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.830360 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0821b55-a4e1-4b0f-af18-513aefaa8d9e-operator-scripts\") pod \"nova-api-db-create-bg8xf\" (UID: \"d0821b55-a4e1-4b0f-af18-513aefaa8d9e\") " pod="openstack/nova-api-db-create-bg8xf" Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.868581 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-db-create-7fsgp"] Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.869736 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7fsgp" Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.891453 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7fsgp"] Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.903138 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-4a7c-account-create-update-tbxvw"] Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.904445 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4a7c-account-create-update-tbxvw" Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.907858 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-db-secret" Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.928518 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4a7c-account-create-update-tbxvw"] Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.932814 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de70af7a-9885-40d1-868d-14c156308212-operator-scripts\") pod \"nova-cell0-db-create-7fsgp\" (UID: \"de70af7a-9885-40d1-868d-14c156308212\") " pod="openstack/nova-cell0-db-create-7fsgp" Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.932928 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k7gv\" (UniqueName: \"kubernetes.io/projected/97833765-fe7a-40eb-9764-180d2123e113-kube-api-access-6k7gv\") pod \"nova-api-4a7c-account-create-update-tbxvw\" (UID: \"97833765-fe7a-40eb-9764-180d2123e113\") " pod="openstack/nova-api-4a7c-account-create-update-tbxvw" Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.933008 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bjvj6\" (UniqueName: \"kubernetes.io/projected/d0821b55-a4e1-4b0f-af18-513aefaa8d9e-kube-api-access-bjvj6\") pod \"nova-api-db-create-bg8xf\" (UID: \"d0821b55-a4e1-4b0f-af18-513aefaa8d9e\") " pod="openstack/nova-api-db-create-bg8xf" Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.933057 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2bmks\" (UniqueName: \"kubernetes.io/projected/de70af7a-9885-40d1-868d-14c156308212-kube-api-access-2bmks\") pod \"nova-cell0-db-create-7fsgp\" 
(UID: \"de70af7a-9885-40d1-868d-14c156308212\") " pod="openstack/nova-cell0-db-create-7fsgp" Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.933095 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0821b55-a4e1-4b0f-af18-513aefaa8d9e-operator-scripts\") pod \"nova-api-db-create-bg8xf\" (UID: \"d0821b55-a4e1-4b0f-af18-513aefaa8d9e\") " pod="openstack/nova-api-db-create-bg8xf" Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.933200 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97833765-fe7a-40eb-9764-180d2123e113-operator-scripts\") pod \"nova-api-4a7c-account-create-update-tbxvw\" (UID: \"97833765-fe7a-40eb-9764-180d2123e113\") " pod="openstack/nova-api-4a7c-account-create-update-tbxvw" Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.934099 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0821b55-a4e1-4b0f-af18-513aefaa8d9e-operator-scripts\") pod \"nova-api-db-create-bg8xf\" (UID: \"d0821b55-a4e1-4b0f-af18-513aefaa8d9e\") " pod="openstack/nova-api-db-create-bg8xf" Mar 19 10:43:43 crc kubenswrapper[4765]: I0319 10:43:43.961471 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjvj6\" (UniqueName: \"kubernetes.io/projected/d0821b55-a4e1-4b0f-af18-513aefaa8d9e-kube-api-access-bjvj6\") pod \"nova-api-db-create-bg8xf\" (UID: \"d0821b55-a4e1-4b0f-af18-513aefaa8d9e\") " pod="openstack/nova-api-db-create-bg8xf" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.001020 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-db-create-bgz4n"] Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.002865 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-bgz4n" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.009933 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bgz4n"] Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.035541 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cb1943e-d6aa-4223-8654-5e674a71b734-operator-scripts\") pod \"nova-cell1-db-create-bgz4n\" (UID: \"1cb1943e-d6aa-4223-8654-5e674a71b734\") " pod="openstack/nova-cell1-db-create-bgz4n" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.035634 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de70af7a-9885-40d1-868d-14c156308212-operator-scripts\") pod \"nova-cell0-db-create-7fsgp\" (UID: \"de70af7a-9885-40d1-868d-14c156308212\") " pod="openstack/nova-cell0-db-create-7fsgp" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.035707 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6k7gv\" (UniqueName: \"kubernetes.io/projected/97833765-fe7a-40eb-9764-180d2123e113-kube-api-access-6k7gv\") pod \"nova-api-4a7c-account-create-update-tbxvw\" (UID: \"97833765-fe7a-40eb-9764-180d2123e113\") " pod="openstack/nova-api-4a7c-account-create-update-tbxvw" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.035768 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2bmks\" (UniqueName: \"kubernetes.io/projected/de70af7a-9885-40d1-868d-14c156308212-kube-api-access-2bmks\") pod \"nova-cell0-db-create-7fsgp\" (UID: \"de70af7a-9885-40d1-868d-14c156308212\") " pod="openstack/nova-cell0-db-create-7fsgp" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.035856 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97833765-fe7a-40eb-9764-180d2123e113-operator-scripts\") pod \"nova-api-4a7c-account-create-update-tbxvw\" (UID: \"97833765-fe7a-40eb-9764-180d2123e113\") " pod="openstack/nova-api-4a7c-account-create-update-tbxvw" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.035893 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hswp2\" (UniqueName: \"kubernetes.io/projected/1cb1943e-d6aa-4223-8654-5e674a71b734-kube-api-access-hswp2\") pod \"nova-cell1-db-create-bgz4n\" (UID: \"1cb1943e-d6aa-4223-8654-5e674a71b734\") " pod="openstack/nova-cell1-db-create-bgz4n" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.037058 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de70af7a-9885-40d1-868d-14c156308212-operator-scripts\") pod \"nova-cell0-db-create-7fsgp\" (UID: \"de70af7a-9885-40d1-868d-14c156308212\") " pod="openstack/nova-cell0-db-create-7fsgp" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.038674 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97833765-fe7a-40eb-9764-180d2123e113-operator-scripts\") pod \"nova-api-4a7c-account-create-update-tbxvw\" (UID: \"97833765-fe7a-40eb-9764-180d2123e113\") " pod="openstack/nova-api-4a7c-account-create-update-tbxvw" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.073111 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2bmks\" (UniqueName: \"kubernetes.io/projected/de70af7a-9885-40d1-868d-14c156308212-kube-api-access-2bmks\") pod \"nova-cell0-db-create-7fsgp\" (UID: \"de70af7a-9885-40d1-868d-14c156308212\") " pod="openstack/nova-cell0-db-create-7fsgp" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.086190 4765 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-6k7gv\" (UniqueName: \"kubernetes.io/projected/97833765-fe7a-40eb-9764-180d2123e113-kube-api-access-6k7gv\") pod \"nova-api-4a7c-account-create-update-tbxvw\" (UID: \"97833765-fe7a-40eb-9764-180d2123e113\") " pod="openstack/nova-api-4a7c-account-create-update-tbxvw" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.096664 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-31ef-account-create-update-kcvq5"] Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.097018 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bg8xf" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.098290 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-31ef-account-create-update-kcvq5" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.104731 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-db-secret" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.128103 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-31ef-account-create-update-kcvq5"] Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.138262 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cb1943e-d6aa-4223-8654-5e674a71b734-operator-scripts\") pod \"nova-cell1-db-create-bgz4n\" (UID: \"1cb1943e-d6aa-4223-8654-5e674a71b734\") " pod="openstack/nova-cell1-db-create-bgz4n" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.138384 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qrfqp\" (UniqueName: \"kubernetes.io/projected/5de4629f-4496-4991-962f-4410df18a713-kube-api-access-qrfqp\") pod \"nova-cell0-31ef-account-create-update-kcvq5\" (UID: \"5de4629f-4496-4991-962f-4410df18a713\") " 
pod="openstack/nova-cell0-31ef-account-create-update-kcvq5" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.138455 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de4629f-4496-4991-962f-4410df18a713-operator-scripts\") pod \"nova-cell0-31ef-account-create-update-kcvq5\" (UID: \"5de4629f-4496-4991-962f-4410df18a713\") " pod="openstack/nova-cell0-31ef-account-create-update-kcvq5" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.138515 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hswp2\" (UniqueName: \"kubernetes.io/projected/1cb1943e-d6aa-4223-8654-5e674a71b734-kube-api-access-hswp2\") pod \"nova-cell1-db-create-bgz4n\" (UID: \"1cb1943e-d6aa-4223-8654-5e674a71b734\") " pod="openstack/nova-cell1-db-create-bgz4n" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.139096 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cb1943e-d6aa-4223-8654-5e674a71b734-operator-scripts\") pod \"nova-cell1-db-create-bgz4n\" (UID: \"1cb1943e-d6aa-4223-8654-5e674a71b734\") " pod="openstack/nova-cell1-db-create-bgz4n" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.164738 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hswp2\" (UniqueName: \"kubernetes.io/projected/1cb1943e-d6aa-4223-8654-5e674a71b734-kube-api-access-hswp2\") pod \"nova-cell1-db-create-bgz4n\" (UID: \"1cb1943e-d6aa-4223-8654-5e674a71b734\") " pod="openstack/nova-cell1-db-create-bgz4n" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.208576 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-db-create-7fsgp" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.225900 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-4a7c-account-create-update-tbxvw" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.240652 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qrfqp\" (UniqueName: \"kubernetes.io/projected/5de4629f-4496-4991-962f-4410df18a713-kube-api-access-qrfqp\") pod \"nova-cell0-31ef-account-create-update-kcvq5\" (UID: \"5de4629f-4496-4991-962f-4410df18a713\") " pod="openstack/nova-cell0-31ef-account-create-update-kcvq5" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.240751 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de4629f-4496-4991-962f-4410df18a713-operator-scripts\") pod \"nova-cell0-31ef-account-create-update-kcvq5\" (UID: \"5de4629f-4496-4991-962f-4410df18a713\") " pod="openstack/nova-cell0-31ef-account-create-update-kcvq5" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.241635 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de4629f-4496-4991-962f-4410df18a713-operator-scripts\") pod \"nova-cell0-31ef-account-create-update-kcvq5\" (UID: \"5de4629f-4496-4991-962f-4410df18a713\") " pod="openstack/nova-cell0-31ef-account-create-update-kcvq5" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.267664 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qrfqp\" (UniqueName: \"kubernetes.io/projected/5de4629f-4496-4991-962f-4410df18a713-kube-api-access-qrfqp\") pod \"nova-cell0-31ef-account-create-update-kcvq5\" (UID: \"5de4629f-4496-4991-962f-4410df18a713\") " pod="openstack/nova-cell0-31ef-account-create-update-kcvq5" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.267983 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/cinder-scheduler-0" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 
10:43:44.295127 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-e031-account-create-update-smc8t"] Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.296554 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e031-account-create-update-smc8t" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.300930 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-db-secret" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.308788 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e031-account-create-update-smc8t"] Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.344218 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63fab428-8477-40cd-bd57-250471e0d108-operator-scripts\") pod \"nova-cell1-e031-account-create-update-smc8t\" (UID: \"63fab428-8477-40cd-bd57-250471e0d108\") " pod="openstack/nova-cell1-e031-account-create-update-smc8t" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.344316 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m94rw\" (UniqueName: \"kubernetes.io/projected/63fab428-8477-40cd-bd57-250471e0d108-kube-api-access-m94rw\") pod \"nova-cell1-e031-account-create-update-smc8t\" (UID: \"63fab428-8477-40cd-bd57-250471e0d108\") " pod="openstack/nova-cell1-e031-account-create-update-smc8t" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.373729 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-db-create-bgz4n" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.447319 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63fab428-8477-40cd-bd57-250471e0d108-operator-scripts\") pod \"nova-cell1-e031-account-create-update-smc8t\" (UID: \"63fab428-8477-40cd-bd57-250471e0d108\") " pod="openstack/nova-cell1-e031-account-create-update-smc8t" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.447397 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m94rw\" (UniqueName: \"kubernetes.io/projected/63fab428-8477-40cd-bd57-250471e0d108-kube-api-access-m94rw\") pod \"nova-cell1-e031-account-create-update-smc8t\" (UID: \"63fab428-8477-40cd-bd57-250471e0d108\") " pod="openstack/nova-cell1-e031-account-create-update-smc8t" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.451136 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63fab428-8477-40cd-bd57-250471e0d108-operator-scripts\") pod \"nova-cell1-e031-account-create-update-smc8t\" (UID: \"63fab428-8477-40cd-bd57-250471e0d108\") " pod="openstack/nova-cell1-e031-account-create-update-smc8t" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.465109 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m94rw\" (UniqueName: \"kubernetes.io/projected/63fab428-8477-40cd-bd57-250471e0d108-kube-api-access-m94rw\") pod \"nova-cell1-e031-account-create-update-smc8t\" (UID: \"63fab428-8477-40cd-bd57-250471e0d108\") " pod="openstack/nova-cell1-e031-account-create-update-smc8t" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.517520 4765 generic.go:334] "Generic (PLEG): container finished" podID="ab7915d2-c641-481f-a9f6-1ce1209c7e17" 
containerID="49b4eaf7fb3307781adb143ca4b6178181d60e6e330bab6ca84d5ec1f928af9b" exitCode=0 Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.517559 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66fb7cb9f6-g7xpk" event={"ID":"ab7915d2-c641-481f-a9f6-1ce1209c7e17","Type":"ContainerDied","Data":"49b4eaf7fb3307781adb143ca4b6178181d60e6e330bab6ca84d5ec1f928af9b"} Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.518161 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-31ef-account-create-update-kcvq5" Mar 19 10:43:44 crc kubenswrapper[4765]: I0319 10:43:44.629642 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e031-account-create-update-smc8t" Mar 19 10:43:45 crc kubenswrapper[4765]: I0319 10:43:45.071896 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5c6bdcb6fb-89kxv" podUID="5112f66b-28fa-4500-b77b-351b8c3d0519" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 19 10:43:46 crc kubenswrapper[4765]: I0319 10:43:46.539442 4765 generic.go:334] "Generic (PLEG): container finished" podID="ab7915d2-c641-481f-a9f6-1ce1209c7e17" containerID="402155193e5eb4df46bcd8d42397b51050e1b818f7e03209b35eab08249dd7aa" exitCode=0 Mar 19 10:43:46 crc kubenswrapper[4765]: I0319 10:43:46.539488 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66fb7cb9f6-g7xpk" event={"ID":"ab7915d2-c641-481f-a9f6-1ce1209c7e17","Type":"ContainerDied","Data":"402155193e5eb4df46bcd8d42397b51050e1b818f7e03209b35eab08249dd7aa"} Mar 19 10:43:47 crc kubenswrapper[4765]: I0319 10:43:47.312490 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="a083bcfd-87a7-43f7-b0a3-1180bea648b3" containerName="glance-httpd" 
probeResult="failure" output="Get \"https://10.217.0.155:9292/healthcheck\": dial tcp 10.217.0.155:9292: connect: connection refused" Mar 19 10:43:47 crc kubenswrapper[4765]: I0319 10:43:47.312490 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-internal-api-0" podUID="a083bcfd-87a7-43f7-b0a3-1180bea648b3" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.155:9292/healthcheck\": dial tcp 10.217.0.155:9292: connect: connection refused" Mar 19 10:43:47 crc kubenswrapper[4765]: I0319 10:43:47.549419 4765 generic.go:334] "Generic (PLEG): container finished" podID="8dd71512-2453-4dff-98d8-3cf981fbbb8f" containerID="ecd8b3390445b80ae72265c157bbdc070db369ac9b9f3af9d421ae1b82ddbdb2" exitCode=0 Mar 19 10:43:47 crc kubenswrapper[4765]: I0319 10:43:47.549479 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8dd71512-2453-4dff-98d8-3cf981fbbb8f","Type":"ContainerDied","Data":"ecd8b3390445b80ae72265c157bbdc070db369ac9b9f3af9d421ae1b82ddbdb2"} Mar 19 10:43:47 crc kubenswrapper[4765]: I0319 10:43:47.620003 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="8dd71512-2453-4dff-98d8-3cf981fbbb8f" containerName="glance-httpd" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": dial tcp 10.217.0.156:9292: connect: connection refused" Mar 19 10:43:47 crc kubenswrapper[4765]: I0319 10:43:47.620084 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/glance-default-external-api-0" podUID="8dd71512-2453-4dff-98d8-3cf981fbbb8f" containerName="glance-log" probeResult="failure" output="Get \"https://10.217.0.156:9292/healthcheck\": dial tcp 10.217.0.156:9292: connect: connection refused" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.381546 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.433232 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a083bcfd-87a7-43f7-b0a3-1180bea648b3-logs\") pod \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.433293 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a083bcfd-87a7-43f7-b0a3-1180bea648b3-httpd-run\") pod \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.433408 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-scripts\") pod \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.433940 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a083bcfd-87a7-43f7-b0a3-1180bea648b3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a083bcfd-87a7-43f7-b0a3-1180bea648b3" (UID: "a083bcfd-87a7-43f7-b0a3-1180bea648b3"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.434068 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-config-data\") pod \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.434103 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-combined-ca-bundle\") pod \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.434123 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk68w\" (UniqueName: \"kubernetes.io/projected/a083bcfd-87a7-43f7-b0a3-1180bea648b3-kube-api-access-xk68w\") pod \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.434141 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.434260 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-internal-tls-certs\") pod \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\" (UID: \"a083bcfd-87a7-43f7-b0a3-1180bea648b3\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.434064 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a083bcfd-87a7-43f7-b0a3-1180bea648b3-logs" (OuterVolumeSpecName: "logs") pod "a083bcfd-87a7-43f7-b0a3-1180bea648b3" (UID: "a083bcfd-87a7-43f7-b0a3-1180bea648b3"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.438812 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a083bcfd-87a7-43f7-b0a3-1180bea648b3-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.438841 4765 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a083bcfd-87a7-43f7-b0a3-1180bea648b3-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.439043 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "a083bcfd-87a7-43f7-b0a3-1180bea648b3" (UID: "a083bcfd-87a7-43f7-b0a3-1180bea648b3"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.439721 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a083bcfd-87a7-43f7-b0a3-1180bea648b3-kube-api-access-xk68w" (OuterVolumeSpecName: "kube-api-access-xk68w") pod "a083bcfd-87a7-43f7-b0a3-1180bea648b3" (UID: "a083bcfd-87a7-43f7-b0a3-1180bea648b3"). InnerVolumeSpecName "kube-api-access-xk68w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.445273 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-scripts" (OuterVolumeSpecName: "scripts") pod "a083bcfd-87a7-43f7-b0a3-1180bea648b3" (UID: "a083bcfd-87a7-43f7-b0a3-1180bea648b3"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.471038 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a083bcfd-87a7-43f7-b0a3-1180bea648b3" (UID: "a083bcfd-87a7-43f7-b0a3-1180bea648b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.531287 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-config-data" (OuterVolumeSpecName: "config-data") pod "a083bcfd-87a7-43f7-b0a3-1180bea648b3" (UID: "a083bcfd-87a7-43f7-b0a3-1180bea648b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.540722 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.540754 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.540766 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.540778 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk68w\" (UniqueName: \"kubernetes.io/projected/a083bcfd-87a7-43f7-b0a3-1180bea648b3-kube-api-access-xk68w\") on node 
\"crc\" DevicePath \"\"" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.540807 4765 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.578633 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "a083bcfd-87a7-43f7-b0a3-1180bea648b3" (UID: "a083bcfd-87a7-43f7-b0a3-1180bea648b3"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.584217 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"a083bcfd-87a7-43f7-b0a3-1180bea648b3","Type":"ContainerDied","Data":"6758a99c044d1524a047506a22b3b0dc8a85c0c3e416de94b3f07f50e8f83a3f"} Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.584286 4765 scope.go:117] "RemoveContainer" containerID="21835277e1595cab353343771c2d49e7872ba78270008da1c8865836e3c549b7" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.584434 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.648802 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.650042 4765 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.662653 4765 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.662702 4765 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/a083bcfd-87a7-43f7-b0a3-1180bea648b3-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.668734 4765 scope.go:117] "RemoveContainer" containerID="a2c13d1a9d7fc306c80cb91654ba453ad5b1dc4a52e34f7c0e29a95aed9cdd2d" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.695176 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.712821 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.737393 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 10:43:48 crc kubenswrapper[4765]: E0319 10:43:48.737836 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd71512-2453-4dff-98d8-3cf981fbbb8f" containerName="glance-log" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.737852 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd71512-2453-4dff-98d8-3cf981fbbb8f" containerName="glance-log" Mar 19 10:43:48 crc kubenswrapper[4765]: E0319 10:43:48.737872 4765 cpu_manager.go:410] "RemoveStaleState: removing 
container" podUID="a083bcfd-87a7-43f7-b0a3-1180bea648b3" containerName="glance-log" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.737880 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a083bcfd-87a7-43f7-b0a3-1180bea648b3" containerName="glance-log" Mar 19 10:43:48 crc kubenswrapper[4765]: E0319 10:43:48.737897 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8dd71512-2453-4dff-98d8-3cf981fbbb8f" containerName="glance-httpd" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.737904 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="8dd71512-2453-4dff-98d8-3cf981fbbb8f" containerName="glance-httpd" Mar 19 10:43:48 crc kubenswrapper[4765]: E0319 10:43:48.737916 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a083bcfd-87a7-43f7-b0a3-1180bea648b3" containerName="glance-httpd" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.737923 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a083bcfd-87a7-43f7-b0a3-1180bea648b3" containerName="glance-httpd" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.742400 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd71512-2453-4dff-98d8-3cf981fbbb8f" containerName="glance-log" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.742435 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="8dd71512-2453-4dff-98d8-3cf981fbbb8f" containerName="glance-httpd" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.742448 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="a083bcfd-87a7-43f7-b0a3-1180bea648b3" containerName="glance-log" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.742471 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="a083bcfd-87a7-43f7-b0a3-1180bea648b3" containerName="glance-httpd" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.743672 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.754708 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-internal-svc" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.755164 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-internal-config-data" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.763594 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd71512-2453-4dff-98d8-3cf981fbbb8f-logs\") pod \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.764106 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-config-data\") pod \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.764273 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dd71512-2453-4dff-98d8-3cf981fbbb8f-httpd-run\") pod \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.764453 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-public-tls-certs\") pod \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.768301 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-combined-ca-bundle\") pod \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.768427 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.768543 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-scripts\") pod \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.768633 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4szlr\" (UniqueName: \"kubernetes.io/projected/8dd71512-2453-4dff-98d8-3cf981fbbb8f-kube-api-access-4szlr\") pod \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\" (UID: \"8dd71512-2453-4dff-98d8-3cf981fbbb8f\") " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.771525 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dd71512-2453-4dff-98d8-3cf981fbbb8f-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8dd71512-2453-4dff-98d8-3cf981fbbb8f" (UID: "8dd71512-2453-4dff-98d8-3cf981fbbb8f"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.775424 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8dd71512-2453-4dff-98d8-3cf981fbbb8f-logs" (OuterVolumeSpecName: "logs") pod "8dd71512-2453-4dff-98d8-3cf981fbbb8f" (UID: "8dd71512-2453-4dff-98d8-3cf981fbbb8f"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.778667 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "8dd71512-2453-4dff-98d8-3cf981fbbb8f" (UID: "8dd71512-2453-4dff-98d8-3cf981fbbb8f"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.784297 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-scripts" (OuterVolumeSpecName: "scripts") pod "8dd71512-2453-4dff-98d8-3cf981fbbb8f" (UID: "8dd71512-2453-4dff-98d8-3cf981fbbb8f"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.786350 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.826208 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8dd71512-2453-4dff-98d8-3cf981fbbb8f-kube-api-access-4szlr" (OuterVolumeSpecName: "kube-api-access-4szlr") pod "8dd71512-2453-4dff-98d8-3cf981fbbb8f" (UID: "8dd71512-2453-4dff-98d8-3cf981fbbb8f"). InnerVolumeSpecName "kube-api-access-4szlr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.859885 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "8dd71512-2453-4dff-98d8-3cf981fbbb8f" (UID: "8dd71512-2453-4dff-98d8-3cf981fbbb8f"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.888945 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.889014 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.889088 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6b89n\" (UniqueName: \"kubernetes.io/projected/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-kube-api-access-6b89n\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.889122 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.889175 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-logs\") pod \"glance-default-internal-api-0\" (UID: 
\"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.889250 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.889274 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.889386 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.889932 4765 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8dd71512-2453-4dff-98d8-3cf981fbbb8f-httpd-run\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.890103 4765 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.890190 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.890296 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.890386 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4szlr\" (UniqueName: \"kubernetes.io/projected/8dd71512-2453-4dff-98d8-3cf981fbbb8f-kube-api-access-4szlr\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.891898 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8dd71512-2453-4dff-98d8-3cf981fbbb8f-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.894223 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "8dd71512-2453-4dff-98d8-3cf981fbbb8f" (UID: "8dd71512-2453-4dff-98d8-3cf981fbbb8f"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.895432 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-config-data" (OuterVolumeSpecName: "config-data") pod "8dd71512-2453-4dff-98d8-3cf981fbbb8f" (UID: "8dd71512-2453-4dff-98d8-3cf981fbbb8f"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.931884 4765 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.985267 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.989347 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.994645 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.995060 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.995195 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.995424 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6b89n\" (UniqueName: 
\"kubernetes.io/projected/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-kube-api-access-6b89n\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.995592 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.995905 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-logs\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.996125 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.996261 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.996478 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-config-data\") on node \"crc\" DevicePath \"\"" Mar 
19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.996624 4765 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/8dd71512-2453-4dff-98d8-3cf981fbbb8f-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.996728 4765 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.996938 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") device mount path \"/mnt/openstack/pv08\"" pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.997735 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-logs\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:48 crc kubenswrapper[4765]: I0319 10:43:48.998060 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:48.999848 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " 
pod="openstack/glance-default-internal-api-0" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.000846 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-combined-ca-bundle\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.002093 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-internal-tls-certs\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.003221 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.037577 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6b89n\" (UniqueName: \"kubernetes.io/projected/e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02-kube-api-access-6b89n\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.051561 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-internal-api-0\" (UID: \"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02\") " pod="openstack/glance-default-internal-api-0" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 
10:43:49.097927 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-scripts\") pod \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.097992 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-combined-ca-bundle\") pod \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.098033 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-config\") pod \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.098074 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tnjqj\" (UniqueName: \"kubernetes.io/projected/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-kube-api-access-tnjqj\") pod \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.098140 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-run-httpd\") pod \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.098178 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-log-httpd\") pod \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\" (UID: 
\"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.098203 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2z27k\" (UniqueName: \"kubernetes.io/projected/ab7915d2-c641-481f-a9f6-1ce1209c7e17-kube-api-access-2z27k\") pod \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.098246 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-combined-ca-bundle\") pod \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.098313 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-sg-core-conf-yaml\") pod \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.098353 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-httpd-config\") pod \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.098415 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-config-data\") pod \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\" (UID: \"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e\") " Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.098432 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovndb-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-ovndb-tls-certs\") pod \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\" (UID: \"ab7915d2-c641-481f-a9f6-1ce1209c7e17\") " Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.102288 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" (UID: "3bfdc94f-3c5d-47de-8d4a-59d804b9b68e"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.102597 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-scripts" (OuterVolumeSpecName: "scripts") pod "3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" (UID: "3bfdc94f-3c5d-47de-8d4a-59d804b9b68e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.105329 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" (UID: "3bfdc94f-3c5d-47de-8d4a-59d804b9b68e"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.105320 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-httpd-config" (OuterVolumeSpecName: "httpd-config") pod "ab7915d2-c641-481f-a9f6-1ce1209c7e17" (UID: "ab7915d2-c641-481f-a9f6-1ce1209c7e17"). InnerVolumeSpecName "httpd-config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.106475 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-kube-api-access-tnjqj" (OuterVolumeSpecName: "kube-api-access-tnjqj") pod "3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" (UID: "3bfdc94f-3c5d-47de-8d4a-59d804b9b68e"). InnerVolumeSpecName "kube-api-access-tnjqj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.119426 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab7915d2-c641-481f-a9f6-1ce1209c7e17-kube-api-access-2z27k" (OuterVolumeSpecName: "kube-api-access-2z27k") pod "ab7915d2-c641-481f-a9f6-1ce1209c7e17" (UID: "ab7915d2-c641-481f-a9f6-1ce1209c7e17"). InnerVolumeSpecName "kube-api-access-2z27k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.174098 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" (UID: "3bfdc94f-3c5d-47de-8d4a-59d804b9b68e"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.182634 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-config" (OuterVolumeSpecName: "config") pod "ab7915d2-c641-481f-a9f6-1ce1209c7e17" (UID: "ab7915d2-c641-481f-a9f6-1ce1209c7e17"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.201892 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.201944 4765 reconciler_common.go:293] "Volume detached for volume \"httpd-config\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-httpd-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.201977 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.201992 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.202004 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tnjqj\" (UniqueName: \"kubernetes.io/projected/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-kube-api-access-tnjqj\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.202019 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.202028 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.202042 4765 reconciler_common.go:293] "Volume detached for 
volume \"kube-api-access-2z27k\" (UniqueName: \"kubernetes.io/projected/ab7915d2-c641-481f-a9f6-1ce1209c7e17-kube-api-access-2z27k\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.220271 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ab7915d2-c641-481f-a9f6-1ce1209c7e17" (UID: "ab7915d2-c641-481f-a9f6-1ce1209c7e17"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.222263 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-ovndb-tls-certs" (OuterVolumeSpecName: "ovndb-tls-certs") pod "ab7915d2-c641-481f-a9f6-1ce1209c7e17" (UID: "ab7915d2-c641-481f-a9f6-1ce1209c7e17"). InnerVolumeSpecName "ovndb-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.250109 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" (UID: "3bfdc94f-3c5d-47de-8d4a-59d804b9b68e"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.277003 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-config-data" (OuterVolumeSpecName: "config-data") pod "3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" (UID: "3bfdc94f-3c5d-47de-8d4a-59d804b9b68e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.280273 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-internal-api-0" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.304066 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.304109 4765 reconciler_common.go:293] "Volume detached for volume \"ovndb-tls-certs\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-ovndb-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.304125 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.304136 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab7915d2-c641-481f-a9f6-1ce1209c7e17-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.323053 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-31ef-account-create-update-kcvq5"] Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.351992 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-db-create-bg8xf"] Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.360912 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-db-create-7fsgp"] Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.370310 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-db-create-bgz4n"] Mar 19 10:43:49 crc 
kubenswrapper[4765]: I0319 10:43:49.380764 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-e031-account-create-update-smc8t"] Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.452815 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-4a7c-account-create-update-tbxvw"] Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.486180 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/swift-proxy-556979b4dc-zj26d"] Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.604276 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e031-account-create-update-smc8t" event={"ID":"63fab428-8477-40cd-bd57-250471e0d108","Type":"ContainerStarted","Data":"fc02afca5fbceeaa6de64fa18e457603b31e91e0b926f1503b6179ee7151fda8"} Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.610828 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bgz4n" event={"ID":"1cb1943e-d6aa-4223-8654-5e674a71b734","Type":"ContainerStarted","Data":"f180e3b23a424b3b30041d49adb8d47535a437dda1c32556b9e3d83a8bf9b4be"} Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.615702 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-556979b4dc-zj26d" event={"ID":"00e0de39-87cf-4a6e-8980-a294f329e430","Type":"ContainerStarted","Data":"c1f4864b4ff82412d7a857bfa108b91f9dbdff452dfc2d5dc3d04ac16e6ed8d7"} Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.618380 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"8dd71512-2453-4dff-98d8-3cf981fbbb8f","Type":"ContainerDied","Data":"3f490d1e63a9e9ef60bbe99ababdfcaa0de7a8f4eb2e6bc867039f2492ba6c2e"} Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.618412 4765 scope.go:117] "RemoveContainer" containerID="ecd8b3390445b80ae72265c157bbdc070db369ac9b9f3af9d421ae1b82ddbdb2" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 
10:43:49.618887 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.633693 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-66fb7cb9f6-g7xpk" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.633666 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-66fb7cb9f6-g7xpk" event={"ID":"ab7915d2-c641-481f-a9f6-1ce1209c7e17","Type":"ContainerDied","Data":"d9a30db13c825fc0be017e8fb75f92508819e713cf7d8ca09ec58af79d3b329b"} Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.639515 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-31ef-account-create-update-kcvq5" event={"ID":"5de4629f-4496-4991-962f-4410df18a713","Type":"ContainerStarted","Data":"82b13eed0efad2fb33c9f2bed7ee5a8346a5b7725e51e81fb294faa5038b6c4f"} Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.650950 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4a7c-account-create-update-tbxvw" event={"ID":"97833765-fe7a-40eb-9764-180d2123e113","Type":"ContainerStarted","Data":"6c7ead3ff5d5cc72a973108d128176d9061d792c35f53af0ad5f0852ee0873f7"} Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.656394 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"3bfdc94f-3c5d-47de-8d4a-59d804b9b68e","Type":"ContainerDied","Data":"cb325346cdc1417ef37ac90ce95f3a5ab8088a09a05964b5b8c0269c2c01a398"} Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.656510 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.674236 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/openstackclient" event={"ID":"1cda8252-a988-49d1-a566-8d9989b86034","Type":"ContainerStarted","Data":"cb30fb3c5298a52d1e9d1744777aafa3365c4a1e3fb5aa4d55bdaaf67ae350b4"} Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.695709 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7fsgp" event={"ID":"de70af7a-9885-40d1-868d-14c156308212","Type":"ContainerStarted","Data":"da8df0bfadb1aa041d4809c5ec2518f3b3ff5d206f63f3af811baf09d98880b1"} Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.699313 4765 scope.go:117] "RemoveContainer" containerID="b6ebe68d3a95bc401568b3ba8caa6bf11c12bccbcf18901317915931dd3c8093" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.702743 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/openstackclient" podStartSLOduration=3.093093929 podStartE2EDuration="16.702720028s" podCreationTimestamp="2026-03-19 10:43:33 +0000 UTC" firstStartedPulling="2026-03-19 10:43:34.741203472 +0000 UTC m=+1313.090149014" lastFinishedPulling="2026-03-19 10:43:48.350829571 +0000 UTC m=+1326.699775113" observedRunningTime="2026-03-19 10:43:49.694437406 +0000 UTC m=+1328.043382948" watchObservedRunningTime="2026-03-19 10:43:49.702720028 +0000 UTC m=+1328.051665570" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.704544 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bg8xf" event={"ID":"d0821b55-a4e1-4b0f-af18-513aefaa8d9e","Type":"ContainerStarted","Data":"dd535de9d8fb22d4e9eba0aa40bed36dea42af2aa79868c5814aa1152160d4a2"} Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.899801 4765 scope.go:117] "RemoveContainer" containerID="49b4eaf7fb3307781adb143ca4b6178181d60e6e330bab6ca84d5ec1f928af9b" Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 
10:43:49.923897 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 10:43:49 crc kubenswrapper[4765]: I0319 10:43:49.973390 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.025382 4765 scope.go:117] "RemoveContainer" containerID="402155193e5eb4df46bcd8d42397b51050e1b818f7e03209b35eab08249dd7aa" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.031881 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 10:43:50 crc kubenswrapper[4765]: E0319 10:43:50.039158 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerName="ceilometer-notification-agent" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.039193 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerName="ceilometer-notification-agent" Mar 19 10:43:50 crc kubenswrapper[4765]: E0319 10:43:50.039239 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerName="sg-core" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.039269 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerName="sg-core" Mar 19 10:43:50 crc kubenswrapper[4765]: E0319 10:43:50.039285 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerName="proxy-httpd" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.039291 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerName="proxy-httpd" Mar 19 10:43:50 crc kubenswrapper[4765]: E0319 10:43:50.039315 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" 
containerName="ceilometer-central-agent" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.039322 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerName="ceilometer-central-agent" Mar 19 10:43:50 crc kubenswrapper[4765]: E0319 10:43:50.039354 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7915d2-c641-481f-a9f6-1ce1209c7e17" containerName="neutron-api" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.039360 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7915d2-c641-481f-a9f6-1ce1209c7e17" containerName="neutron-api" Mar 19 10:43:50 crc kubenswrapper[4765]: E0319 10:43:50.039385 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab7915d2-c641-481f-a9f6-1ce1209c7e17" containerName="neutron-httpd" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.039394 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab7915d2-c641-481f-a9f6-1ce1209c7e17" containerName="neutron-httpd" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.040354 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab7915d2-c641-481f-a9f6-1ce1209c7e17" containerName="neutron-api" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.040412 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerName="ceilometer-notification-agent" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.040431 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerName="ceilometer-central-agent" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.040449 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerName="proxy-httpd" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.040471 4765 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" containerName="sg-core" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.040502 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab7915d2-c641-481f-a9f6-1ce1209c7e17" containerName="neutron-httpd" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.044330 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.050112 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"glance-default-external-config-data" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.050358 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-glance-default-public-svc" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.088757 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-internal-api-0"] Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.117878 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.135135 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.135207 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.135235 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-config-data\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.135458 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.135532 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g8mfx\" (UniqueName: \"kubernetes.io/projected/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-kube-api-access-g8mfx\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.135594 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-logs\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.135765 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc 
kubenswrapper[4765]: I0319 10:43:50.135891 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-scripts\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.141508 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-66fb7cb9f6-g7xpk"] Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.155252 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-66fb7cb9f6-g7xpk"] Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.166220 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.189048 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.201080 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.203425 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.207818 4765 scope.go:117] "RemoveContainer" containerID="7c87532bb51573b7b5457433c5fdf1d8df5eab8acce3a55904cfafe7b1b8e55f" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.207886 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.208141 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.209828 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.238576 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g8mfx\" (UniqueName: \"kubernetes.io/projected/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-kube-api-access-g8mfx\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.238631 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-scripts\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.238650 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-logs\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.238691 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.238708 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-scripts\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.238753 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.238769 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.238787 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-config-data\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.238804 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.238852 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/200aa20d-4c08-468b-b3e9-624d474124b3-run-httpd\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.238870 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/200aa20d-4c08-468b-b3e9-624d474124b3-log-httpd\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.238893 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4l72\" (UniqueName: \"kubernetes.io/projected/200aa20d-4c08-468b-b3e9-624d474124b3-kube-api-access-r4l72\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.238906 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.238921 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " 
pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.238952 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.240265 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.244053 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-logs\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.247167 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") device mount path \"/mnt/openstack/pv07\"" pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.250902 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-public-tls-certs\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 
10:43:50.260021 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-combined-ca-bundle\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.260456 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-scripts\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.260837 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-config-data\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.266307 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g8mfx\" (UniqueName: \"kubernetes.io/projected/0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac-kube-api-access-g8mfx\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.271186 4765 scope.go:117] "RemoveContainer" containerID="cad56476ad5b846839baa9d8083c1e920e626f18eed117f98b7234f4d1eb4095" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.301027 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-external-api-0\" (UID: \"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac\") " 
pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.339906 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-config-data\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.340344 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/200aa20d-4c08-468b-b3e9-624d474124b3-run-httpd\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.340372 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/200aa20d-4c08-468b-b3e9-624d474124b3-log-httpd\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.340402 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4l72\" (UniqueName: \"kubernetes.io/projected/200aa20d-4c08-468b-b3e9-624d474124b3-kube-api-access-r4l72\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.340425 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.340447 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: 
\"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.340503 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-scripts\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.342022 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/200aa20d-4c08-468b-b3e9-624d474124b3-log-httpd\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.346319 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/200aa20d-4c08-468b-b3e9-624d474124b3-run-httpd\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.350872 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.365394 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-config-data\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.365677 4765 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.370734 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-scripts\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.373667 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4l72\" (UniqueName: \"kubernetes.io/projected/200aa20d-4c08-468b-b3e9-624d474124b3-kube-api-access-r4l72\") pod \"ceilometer-0\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.376835 4765 scope.go:117] "RemoveContainer" containerID="de7c5f366d67ff6daf17d6353f3ea686deb394926d0e0300eca17c370e527207" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.384140 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bfdc94f-3c5d-47de-8d4a-59d804b9b68e" path="/var/lib/kubelet/pods/3bfdc94f-3c5d-47de-8d4a-59d804b9b68e/volumes" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.384930 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8dd71512-2453-4dff-98d8-3cf981fbbb8f" path="/var/lib/kubelet/pods/8dd71512-2453-4dff-98d8-3cf981fbbb8f/volumes" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.386553 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a083bcfd-87a7-43f7-b0a3-1180bea648b3" path="/var/lib/kubelet/pods/a083bcfd-87a7-43f7-b0a3-1180bea648b3/volumes" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.388899 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="ab7915d2-c641-481f-a9f6-1ce1209c7e17" path="/var/lib/kubelet/pods/ab7915d2-c641-481f-a9f6-1ce1209c7e17/volumes" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.436464 4765 scope.go:117] "RemoveContainer" containerID="35017f4b28567434d5a7365d0acc8138f4fe28cd23ffc3b505a53cf519cf71f9" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.495981 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/glance-default-external-api-0" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.530038 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:43:50 crc kubenswrapper[4765]: E0319 10:43:50.625091 4765 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/9d54d0baa4119778b75b8f678ad4daf2ce3990618955c72a38e2a6af28ba96c0/diff" to get inode usage: stat /var/lib/containers/storage/overlay/9d54d0baa4119778b75b8f678ad4daf2ce3990618955c72a38e2a6af28ba96c0/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_glance-default-external-api-0_8dd71512-2453-4dff-98d8-3cf981fbbb8f/glance-httpd/0.log" to get inode usage: stat /var/log/pods/openstack_glance-default-external-api-0_8dd71512-2453-4dff-98d8-3cf981fbbb8f/glance-httpd/0.log: no such file or directory Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.721365 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02","Type":"ContainerStarted","Data":"2970fcd8db3b5aed0cde72d713346b2063bdb72724ae88c82f4efde3e68322e2"} Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.724309 4765 generic.go:334] "Generic (PLEG): container finished" podID="5de4629f-4496-4991-962f-4410df18a713" containerID="b7e33e2b91b09b641878bcc075c4d7b5fb8eb68bd0db94f878cf3b9382d50b4b" exitCode=0 Mar 19 10:43:50 crc kubenswrapper[4765]: 
I0319 10:43:50.724373 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-31ef-account-create-update-kcvq5" event={"ID":"5de4629f-4496-4991-962f-4410df18a713","Type":"ContainerDied","Data":"b7e33e2b91b09b641878bcc075c4d7b5fb8eb68bd0db94f878cf3b9382d50b4b"} Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.736973 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4a7c-account-create-update-tbxvw" event={"ID":"97833765-fe7a-40eb-9764-180d2123e113","Type":"ContainerStarted","Data":"edf040c33203f5d50ab2e2972078f229e81099369bb6ab8599a8caa00aec6236"} Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.745605 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e031-account-create-update-smc8t" event={"ID":"63fab428-8477-40cd-bd57-250471e0d108","Type":"ContainerStarted","Data":"e196bf8e817931b65767a3792002f17daefaf7f72feb2155ee556f929c4fce0b"} Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.774191 4765 generic.go:334] "Generic (PLEG): container finished" podID="d0821b55-a4e1-4b0f-af18-513aefaa8d9e" containerID="1c943e9ec8e4b88b50621f0140f2ff452c402bcd733d403b06e877382ef1c5e2" exitCode=0 Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.774772 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bg8xf" event={"ID":"d0821b55-a4e1-4b0f-af18-513aefaa8d9e","Type":"ContainerDied","Data":"1c943e9ec8e4b88b50621f0140f2ff452c402bcd733d403b06e877382ef1c5e2"} Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.781210 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bgz4n" event={"ID":"1cb1943e-d6aa-4223-8654-5e674a71b734","Type":"ContainerStarted","Data":"213393381c9495cf24e47a93ec6c2fbe35ee4f3848e3c01fedb9b59b7b70cdb0"} Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.796470 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/nova-api-4a7c-account-create-update-tbxvw" podStartSLOduration=7.796443603 podStartE2EDuration="7.796443603s" podCreationTimestamp="2026-03-19 10:43:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:50.775403263 +0000 UTC m=+1329.124348805" watchObservedRunningTime="2026-03-19 10:43:50.796443603 +0000 UTC m=+1329.145389145" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.804409 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-556979b4dc-zj26d" event={"ID":"00e0de39-87cf-4a6e-8980-a294f329e430","Type":"ContainerStarted","Data":"2b4f088a3d6715717daffd654b47247a88e017f2ed45a9c271e3cf580abeac73"} Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.804462 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/swift-proxy-556979b4dc-zj26d" event={"ID":"00e0de39-87cf-4a6e-8980-a294f329e430","Type":"ContainerStarted","Data":"02598ca2c9439d81daa406cef91f610dd739eb4663f79ae6b71ff3fff110feb9"} Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.804876 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.805020 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.838382 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-e031-account-create-update-smc8t" podStartSLOduration=6.838347439 podStartE2EDuration="6.838347439s" podCreationTimestamp="2026-03-19 10:43:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:50.818051679 +0000 UTC m=+1329.166997221" watchObservedRunningTime="2026-03-19 10:43:50.838347439 +0000 UTC 
m=+1329.187292981" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.845108 4765 generic.go:334] "Generic (PLEG): container finished" podID="de70af7a-9885-40d1-868d-14c156308212" containerID="0d09b81495bf5da85632e84b0aaea7be01f3b5e425bd8111c3eaf7ead482fd6f" exitCode=0 Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.845180 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7fsgp" event={"ID":"de70af7a-9885-40d1-868d-14c156308212","Type":"ContainerDied","Data":"0d09b81495bf5da85632e84b0aaea7be01f3b5e425bd8111c3eaf7ead482fd6f"} Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.868435 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-db-create-bgz4n" podStartSLOduration=7.868401091 podStartE2EDuration="7.868401091s" podCreationTimestamp="2026-03-19 10:43:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:50.835982342 +0000 UTC m=+1329.184927894" watchObservedRunningTime="2026-03-19 10:43:50.868401091 +0000 UTC m=+1329.217346633" Mar 19 10:43:50 crc kubenswrapper[4765]: I0319 10:43:50.880561 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/swift-proxy-556979b4dc-zj26d" podStartSLOduration=8.880542572 podStartE2EDuration="8.880542572s" podCreationTimestamp="2026-03-19 10:43:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:50.860344785 +0000 UTC m=+1329.209290347" watchObservedRunningTime="2026-03-19 10:43:50.880542572 +0000 UTC m=+1329.229488114" Mar 19 10:43:51 crc kubenswrapper[4765]: I0319 10:43:51.429828 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/glance-default-external-api-0"] Mar 19 10:43:51 crc kubenswrapper[4765]: I0319 10:43:51.444917 4765 kubelet.go:2428] "SyncLoop UPDATE" 
source="api" pods=["openstack/ceilometer-0"] Mar 19 10:43:51 crc kubenswrapper[4765]: E0319 10:43:51.749788 4765 fsHandler.go:119] failed to collect filesystem stats - rootDiskErr: could not stat "/var/lib/containers/storage/overlay/66d7512c0b7dfca0a1a5cd0bdff1d8a06ca361ea8436ea4042ad3da71b640e07/diff" to get inode usage: stat /var/lib/containers/storage/overlay/66d7512c0b7dfca0a1a5cd0bdff1d8a06ca361ea8436ea4042ad3da71b640e07/diff: no such file or directory, extraDiskErr: could not stat "/var/log/pods/openstack_glance-default-internal-api-0_a083bcfd-87a7-43f7-b0a3-1180bea648b3/glance-httpd/0.log" to get inode usage: stat /var/log/pods/openstack_glance-default-internal-api-0_a083bcfd-87a7-43f7-b0a3-1180bea648b3/glance-httpd/0.log: no such file or directory Mar 19 10:43:51 crc kubenswrapper[4765]: I0319 10:43:51.905460 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"200aa20d-4c08-468b-b3e9-624d474124b3","Type":"ContainerStarted","Data":"7cd22f243003b4007e330747994313a4db66277b89a504c50e4e2f72d26f0951"} Mar 19 10:43:51 crc kubenswrapper[4765]: I0319 10:43:51.915129 4765 generic.go:334] "Generic (PLEG): container finished" podID="97833765-fe7a-40eb-9764-180d2123e113" containerID="edf040c33203f5d50ab2e2972078f229e81099369bb6ab8599a8caa00aec6236" exitCode=0 Mar 19 10:43:51 crc kubenswrapper[4765]: I0319 10:43:51.915212 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4a7c-account-create-update-tbxvw" event={"ID":"97833765-fe7a-40eb-9764-180d2123e113","Type":"ContainerDied","Data":"edf040c33203f5d50ab2e2972078f229e81099369bb6ab8599a8caa00aec6236"} Mar 19 10:43:51 crc kubenswrapper[4765]: I0319 10:43:51.935017 4765 generic.go:334] "Generic (PLEG): container finished" podID="63fab428-8477-40cd-bd57-250471e0d108" containerID="e196bf8e817931b65767a3792002f17daefaf7f72feb2155ee556f929c4fce0b" exitCode=0 Mar 19 10:43:51 crc kubenswrapper[4765]: I0319 10:43:51.935387 4765 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openstack/nova-cell1-e031-account-create-update-smc8t" event={"ID":"63fab428-8477-40cd-bd57-250471e0d108","Type":"ContainerDied","Data":"e196bf8e817931b65767a3792002f17daefaf7f72feb2155ee556f929c4fce0b"} Mar 19 10:43:51 crc kubenswrapper[4765]: I0319 10:43:51.954882 4765 generic.go:334] "Generic (PLEG): container finished" podID="1cb1943e-d6aa-4223-8654-5e674a71b734" containerID="213393381c9495cf24e47a93ec6c2fbe35ee4f3848e3c01fedb9b59b7b70cdb0" exitCode=0 Mar 19 10:43:51 crc kubenswrapper[4765]: I0319 10:43:51.955221 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bgz4n" event={"ID":"1cb1943e-d6aa-4223-8654-5e674a71b734","Type":"ContainerDied","Data":"213393381c9495cf24e47a93ec6c2fbe35ee4f3848e3c01fedb9b59b7b70cdb0"} Mar 19 10:43:51 crc kubenswrapper[4765]: I0319 10:43:51.972107 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac","Type":"ContainerStarted","Data":"61cc55b6b55edc941ef836af3191505cdd2300ac2500d1bd2c4d1106a480daca"} Mar 19 10:43:51 crc kubenswrapper[4765]: I0319 10:43:51.985457 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02","Type":"ContainerStarted","Data":"5ef2c6393d012daf250012a82a6bfba68368e0381d347b1f051db0edaaa170e2"} Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.434403 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7fsgp" Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.609051 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de70af7a-9885-40d1-868d-14c156308212-operator-scripts\") pod \"de70af7a-9885-40d1-868d-14c156308212\" (UID: \"de70af7a-9885-40d1-868d-14c156308212\") " Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.610161 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2bmks\" (UniqueName: \"kubernetes.io/projected/de70af7a-9885-40d1-868d-14c156308212-kube-api-access-2bmks\") pod \"de70af7a-9885-40d1-868d-14c156308212\" (UID: \"de70af7a-9885-40d1-868d-14c156308212\") " Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.611246 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/de70af7a-9885-40d1-868d-14c156308212-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "de70af7a-9885-40d1-868d-14c156308212" (UID: "de70af7a-9885-40d1-868d-14c156308212"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.619119 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de70af7a-9885-40d1-868d-14c156308212-kube-api-access-2bmks" (OuterVolumeSpecName: "kube-api-access-2bmks") pod "de70af7a-9885-40d1-868d-14c156308212" (UID: "de70af7a-9885-40d1-868d-14c156308212"). InnerVolumeSpecName "kube-api-access-2bmks". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.637844 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-31ef-account-create-update-kcvq5" Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.649509 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bg8xf" Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.712291 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/de70af7a-9885-40d1-868d-14c156308212-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.712345 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2bmks\" (UniqueName: \"kubernetes.io/projected/de70af7a-9885-40d1-868d-14c156308212-kube-api-access-2bmks\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.814081 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de4629f-4496-4991-962f-4410df18a713-operator-scripts\") pod \"5de4629f-4496-4991-962f-4410df18a713\" (UID: \"5de4629f-4496-4991-962f-4410df18a713\") " Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.814215 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0821b55-a4e1-4b0f-af18-513aefaa8d9e-operator-scripts\") pod \"d0821b55-a4e1-4b0f-af18-513aefaa8d9e\" (UID: \"d0821b55-a4e1-4b0f-af18-513aefaa8d9e\") " Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.814296 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bjvj6\" (UniqueName: \"kubernetes.io/projected/d0821b55-a4e1-4b0f-af18-513aefaa8d9e-kube-api-access-bjvj6\") pod \"d0821b55-a4e1-4b0f-af18-513aefaa8d9e\" (UID: \"d0821b55-a4e1-4b0f-af18-513aefaa8d9e\") " Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.814355 4765 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qrfqp\" (UniqueName: \"kubernetes.io/projected/5de4629f-4496-4991-962f-4410df18a713-kube-api-access-qrfqp\") pod \"5de4629f-4496-4991-962f-4410df18a713\" (UID: \"5de4629f-4496-4991-962f-4410df18a713\") " Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.815776 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0821b55-a4e1-4b0f-af18-513aefaa8d9e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d0821b55-a4e1-4b0f-af18-513aefaa8d9e" (UID: "d0821b55-a4e1-4b0f-af18-513aefaa8d9e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.816232 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5de4629f-4496-4991-962f-4410df18a713-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5de4629f-4496-4991-962f-4410df18a713" (UID: "5de4629f-4496-4991-962f-4410df18a713"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.818850 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5de4629f-4496-4991-962f-4410df18a713-kube-api-access-qrfqp" (OuterVolumeSpecName: "kube-api-access-qrfqp") pod "5de4629f-4496-4991-962f-4410df18a713" (UID: "5de4629f-4496-4991-962f-4410df18a713"). InnerVolumeSpecName "kube-api-access-qrfqp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.832137 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0821b55-a4e1-4b0f-af18-513aefaa8d9e-kube-api-access-bjvj6" (OuterVolumeSpecName: "kube-api-access-bjvj6") pod "d0821b55-a4e1-4b0f-af18-513aefaa8d9e" (UID: "d0821b55-a4e1-4b0f-af18-513aefaa8d9e"). InnerVolumeSpecName "kube-api-access-bjvj6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.916546 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5de4629f-4496-4991-962f-4410df18a713-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.916581 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d0821b55-a4e1-4b0f-af18-513aefaa8d9e-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.916595 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bjvj6\" (UniqueName: \"kubernetes.io/projected/d0821b55-a4e1-4b0f-af18-513aefaa8d9e-kube-api-access-bjvj6\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.916606 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qrfqp\" (UniqueName: \"kubernetes.io/projected/5de4629f-4496-4991-962f-4410df18a713-kube-api-access-qrfqp\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:52 crc kubenswrapper[4765]: I0319 10:43:52.998735 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"200aa20d-4c08-468b-b3e9-624d474124b3","Type":"ContainerStarted","Data":"ff9f488b9e6672417a0584c6eb408c902ff72e1bae985433151cc691d7fb4abe"} Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.000666 4765 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-db-create-bg8xf" event={"ID":"d0821b55-a4e1-4b0f-af18-513aefaa8d9e","Type":"ContainerDied","Data":"dd535de9d8fb22d4e9eba0aa40bed36dea42af2aa79868c5814aa1152160d4a2"} Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.000695 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dd535de9d8fb22d4e9eba0aa40bed36dea42af2aa79868c5814aa1152160d4a2" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.000750 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-db-create-bg8xf" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.003500 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-db-create-7fsgp" event={"ID":"de70af7a-9885-40d1-868d-14c156308212","Type":"ContainerDied","Data":"da8df0bfadb1aa041d4809c5ec2518f3b3ff5d206f63f3af811baf09d98880b1"} Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.003550 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="da8df0bfadb1aa041d4809c5ec2518f3b3ff5d206f63f3af811baf09d98880b1" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.003628 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-db-create-7fsgp" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.022696 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac","Type":"ContainerStarted","Data":"11d123d6eecf0b0a360b7ed4e7529199f051c3d83b4b04c3b4ade6e9d9cd3824"} Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.028399 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-internal-api-0" event={"ID":"e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02","Type":"ContainerStarted","Data":"3662da078e758632e8d803f6d2e942efa021f6344a2d87e5367415e276051017"} Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.038804 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-31ef-account-create-update-kcvq5" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.048100 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-31ef-account-create-update-kcvq5" event={"ID":"5de4629f-4496-4991-962f-4410df18a713","Type":"ContainerDied","Data":"82b13eed0efad2fb33c9f2bed7ee5a8346a5b7725e51e81fb294faa5038b6c4f"} Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.048308 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82b13eed0efad2fb33c9f2bed7ee5a8346a5b7725e51e81fb294faa5038b6c4f" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.082622 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-internal-api-0" podStartSLOduration=5.082599444 podStartE2EDuration="5.082599444s" podCreationTimestamp="2026-03-19 10:43:48 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:53.075843734 +0000 UTC m=+1331.424789296" watchObservedRunningTime="2026-03-19 10:43:53.082599444 +0000 
UTC m=+1331.431544986" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.441379 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e031-account-create-update-smc8t" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.541685 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63fab428-8477-40cd-bd57-250471e0d108-operator-scripts\") pod \"63fab428-8477-40cd-bd57-250471e0d108\" (UID: \"63fab428-8477-40cd-bd57-250471e0d108\") " Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.542003 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m94rw\" (UniqueName: \"kubernetes.io/projected/63fab428-8477-40cd-bd57-250471e0d108-kube-api-access-m94rw\") pod \"63fab428-8477-40cd-bd57-250471e0d108\" (UID: \"63fab428-8477-40cd-bd57-250471e0d108\") " Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.551841 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/63fab428-8477-40cd-bd57-250471e0d108-kube-api-access-m94rw" (OuterVolumeSpecName: "kube-api-access-m94rw") pod "63fab428-8477-40cd-bd57-250471e0d108" (UID: "63fab428-8477-40cd-bd57-250471e0d108"). InnerVolumeSpecName "kube-api-access-m94rw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.556755 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/63fab428-8477-40cd-bd57-250471e0d108-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "63fab428-8477-40cd-bd57-250471e0d108" (UID: "63fab428-8477-40cd-bd57-250471e0d108"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.648829 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m94rw\" (UniqueName: \"kubernetes.io/projected/63fab428-8477-40cd-bd57-250471e0d108-kube-api-access-m94rw\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.648872 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/63fab428-8477-40cd-bd57-250471e0d108-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.679591 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4a7c-account-create-update-tbxvw" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.698298 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bgz4n" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.698760 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.864618 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-combined-ca-bundle\") pod \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.864691 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97833765-fe7a-40eb-9764-180d2123e113-operator-scripts\") pod \"97833765-fe7a-40eb-9764-180d2123e113\" (UID: \"97833765-fe7a-40eb-9764-180d2123e113\") " Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.864799 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4wrv8\" (UniqueName: \"kubernetes.io/projected/f567275e-0c40-4ef2-8c5f-fb40aad223f8-kube-api-access-4wrv8\") pod \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.864853 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-config-data-custom\") pod \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.864941 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hswp2\" (UniqueName: \"kubernetes.io/projected/1cb1943e-d6aa-4223-8654-5e674a71b734-kube-api-access-hswp2\") pod \"1cb1943e-d6aa-4223-8654-5e674a71b734\" (UID: \"1cb1943e-d6aa-4223-8654-5e674a71b734\") " Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.864982 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-config-data\") pod \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.865019 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-scripts\") pod \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.865066 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f567275e-0c40-4ef2-8c5f-fb40aad223f8-etc-machine-id\") pod \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.865121 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f567275e-0c40-4ef2-8c5f-fb40aad223f8-logs\") pod \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\" (UID: \"f567275e-0c40-4ef2-8c5f-fb40aad223f8\") " Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.865168 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k7gv\" (UniqueName: \"kubernetes.io/projected/97833765-fe7a-40eb-9764-180d2123e113-kube-api-access-6k7gv\") pod \"97833765-fe7a-40eb-9764-180d2123e113\" (UID: \"97833765-fe7a-40eb-9764-180d2123e113\") " Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.865254 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cb1943e-d6aa-4223-8654-5e674a71b734-operator-scripts\") pod \"1cb1943e-d6aa-4223-8654-5e674a71b734\" (UID: \"1cb1943e-d6aa-4223-8654-5e674a71b734\") " Mar 19 10:43:53 crc 
kubenswrapper[4765]: I0319 10:43:53.866100 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/97833765-fe7a-40eb-9764-180d2123e113-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "97833765-fe7a-40eb-9764-180d2123e113" (UID: "97833765-fe7a-40eb-9764-180d2123e113"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.866407 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1cb1943e-d6aa-4223-8654-5e674a71b734-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "1cb1943e-d6aa-4223-8654-5e674a71b734" (UID: "1cb1943e-d6aa-4223-8654-5e674a71b734"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.866598 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f567275e-0c40-4ef2-8c5f-fb40aad223f8-etc-machine-id" (OuterVolumeSpecName: "etc-machine-id") pod "f567275e-0c40-4ef2-8c5f-fb40aad223f8" (UID: "f567275e-0c40-4ef2-8c5f-fb40aad223f8"). InnerVolumeSpecName "etc-machine-id". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.870733 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f567275e-0c40-4ef2-8c5f-fb40aad223f8-logs" (OuterVolumeSpecName: "logs") pod "f567275e-0c40-4ef2-8c5f-fb40aad223f8" (UID: "f567275e-0c40-4ef2-8c5f-fb40aad223f8"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.870966 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1cb1943e-d6aa-4223-8654-5e674a71b734-kube-api-access-hswp2" (OuterVolumeSpecName: "kube-api-access-hswp2") pod "1cb1943e-d6aa-4223-8654-5e674a71b734" (UID: "1cb1943e-d6aa-4223-8654-5e674a71b734"). InnerVolumeSpecName "kube-api-access-hswp2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.874536 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-scripts" (OuterVolumeSpecName: "scripts") pod "f567275e-0c40-4ef2-8c5f-fb40aad223f8" (UID: "f567275e-0c40-4ef2-8c5f-fb40aad223f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.876305 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f567275e-0c40-4ef2-8c5f-fb40aad223f8-kube-api-access-4wrv8" (OuterVolumeSpecName: "kube-api-access-4wrv8") pod "f567275e-0c40-4ef2-8c5f-fb40aad223f8" (UID: "f567275e-0c40-4ef2-8c5f-fb40aad223f8"). InnerVolumeSpecName "kube-api-access-4wrv8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.899915 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/97833765-fe7a-40eb-9764-180d2123e113-kube-api-access-6k7gv" (OuterVolumeSpecName: "kube-api-access-6k7gv") pod "97833765-fe7a-40eb-9764-180d2123e113" (UID: "97833765-fe7a-40eb-9764-180d2123e113"). InnerVolumeSpecName "kube-api-access-6k7gv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.931250 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-config-data-custom" (OuterVolumeSpecName: "config-data-custom") pod "f567275e-0c40-4ef2-8c5f-fb40aad223f8" (UID: "f567275e-0c40-4ef2-8c5f-fb40aad223f8"). InnerVolumeSpecName "config-data-custom". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.948134 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-config-data" (OuterVolumeSpecName: "config-data") pod "f567275e-0c40-4ef2-8c5f-fb40aad223f8" (UID: "f567275e-0c40-4ef2-8c5f-fb40aad223f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.968921 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/f567275e-0c40-4ef2-8c5f-fb40aad223f8-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.968969 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6k7gv\" (UniqueName: \"kubernetes.io/projected/97833765-fe7a-40eb-9764-180d2123e113-kube-api-access-6k7gv\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.968986 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/1cb1943e-d6aa-4223-8654-5e674a71b734-operator-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.968995 4765 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/97833765-fe7a-40eb-9764-180d2123e113-operator-scripts\") on node \"crc\" DevicePath 
\"\"" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.969007 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4wrv8\" (UniqueName: \"kubernetes.io/projected/f567275e-0c40-4ef2-8c5f-fb40aad223f8-kube-api-access-4wrv8\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.969015 4765 reconciler_common.go:293] "Volume detached for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-config-data-custom\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.969023 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hswp2\" (UniqueName: \"kubernetes.io/projected/1cb1943e-d6aa-4223-8654-5e674a71b734-kube-api-access-hswp2\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.969031 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.969039 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.969046 4765 reconciler_common.go:293] "Volume detached for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/f567275e-0c40-4ef2-8c5f-fb40aad223f8-etc-machine-id\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:53 crc kubenswrapper[4765]: I0319 10:43:53.970558 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f567275e-0c40-4ef2-8c5f-fb40aad223f8" (UID: "f567275e-0c40-4ef2-8c5f-fb40aad223f8"). 
InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.052427 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/glance-default-external-api-0" event={"ID":"0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac","Type":"ContainerStarted","Data":"c1444379d3875b28d8f1db3dca75b1ae184b25f806a2f0110efedae8f167b93e"} Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.055097 4765 generic.go:334] "Generic (PLEG): container finished" podID="f567275e-0c40-4ef2-8c5f-fb40aad223f8" containerID="1a32fdc48ba6fef3a705f3889d8d3b36ffec2096785e8d52ad921d64bba59bdf" exitCode=137 Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.055191 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.055176 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f567275e-0c40-4ef2-8c5f-fb40aad223f8","Type":"ContainerDied","Data":"1a32fdc48ba6fef3a705f3889d8d3b36ffec2096785e8d52ad921d64bba59bdf"} Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.056700 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"f567275e-0c40-4ef2-8c5f-fb40aad223f8","Type":"ContainerDied","Data":"23b661699f9206411e682dfbf2c1e896ea6afe4e7c932d35aa1382d0bcce9704"} Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.056740 4765 scope.go:117] "RemoveContainer" containerID="1a32fdc48ba6fef3a705f3889d8d3b36ffec2096785e8d52ad921d64bba59bdf" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.063032 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"200aa20d-4c08-468b-b3e9-624d474124b3","Type":"ContainerStarted","Data":"97fa4bb2fc6965385bfc207947dd3dcb057fd521b66ab9225e2953eed87ef104"} Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.070314 4765 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openstack/nova-api-4a7c-account-create-update-tbxvw" event={"ID":"97833765-fe7a-40eb-9764-180d2123e113","Type":"ContainerDied","Data":"6c7ead3ff5d5cc72a973108d128176d9061d792c35f53af0ad5f0852ee0873f7"} Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.070371 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c7ead3ff5d5cc72a973108d128176d9061d792c35f53af0ad5f0852ee0873f7" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.070468 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-4a7c-account-create-update-tbxvw" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.071692 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f567275e-0c40-4ef2-8c5f-fb40aad223f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.082830 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/glance-default-external-api-0" podStartSLOduration=5.082809606 podStartE2EDuration="5.082809606s" podCreationTimestamp="2026-03-19 10:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:54.07547206 +0000 UTC m=+1332.424417612" watchObservedRunningTime="2026-03-19 10:43:54.082809606 +0000 UTC m=+1332.431755138" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.086474 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-e031-account-create-update-smc8t" event={"ID":"63fab428-8477-40cd-bd57-250471e0d108","Type":"ContainerDied","Data":"fc02afca5fbceeaa6de64fa18e457603b31e91e0b926f1503b6179ee7151fda8"} Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.086515 4765 pod_container_deletor.go:80] "Container not found in pod's containers" 
containerID="fc02afca5fbceeaa6de64fa18e457603b31e91e0b926f1503b6179ee7151fda8" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.086592 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-e031-account-create-update-smc8t" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.095158 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-db-create-bgz4n" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.095458 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-db-create-bgz4n" event={"ID":"1cb1943e-d6aa-4223-8654-5e674a71b734","Type":"ContainerDied","Data":"f180e3b23a424b3b30041d49adb8d47535a437dda1c32556b9e3d83a8bf9b4be"} Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.095522 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f180e3b23a424b3b30041d49adb8d47535a437dda1c32556b9e3d83a8bf9b4be" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.109667 4765 scope.go:117] "RemoveContainer" containerID="e2f21f434018f81ca084379be893dd45c99b806338b7261cd698760fdfdad3e7" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.150177 4765 scope.go:117] "RemoveContainer" containerID="1a32fdc48ba6fef3a705f3889d8d3b36ffec2096785e8d52ad921d64bba59bdf" Mar 19 10:43:54 crc kubenswrapper[4765]: E0319 10:43:54.151203 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a32fdc48ba6fef3a705f3889d8d3b36ffec2096785e8d52ad921d64bba59bdf\": container with ID starting with 1a32fdc48ba6fef3a705f3889d8d3b36ffec2096785e8d52ad921d64bba59bdf not found: ID does not exist" containerID="1a32fdc48ba6fef3a705f3889d8d3b36ffec2096785e8d52ad921d64bba59bdf" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.151279 4765 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"1a32fdc48ba6fef3a705f3889d8d3b36ffec2096785e8d52ad921d64bba59bdf"} err="failed to get container status \"1a32fdc48ba6fef3a705f3889d8d3b36ffec2096785e8d52ad921d64bba59bdf\": rpc error: code = NotFound desc = could not find container \"1a32fdc48ba6fef3a705f3889d8d3b36ffec2096785e8d52ad921d64bba59bdf\": container with ID starting with 1a32fdc48ba6fef3a705f3889d8d3b36ffec2096785e8d52ad921d64bba59bdf not found: ID does not exist" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.151316 4765 scope.go:117] "RemoveContainer" containerID="e2f21f434018f81ca084379be893dd45c99b806338b7261cd698760fdfdad3e7" Mar 19 10:43:54 crc kubenswrapper[4765]: E0319 10:43:54.151626 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e2f21f434018f81ca084379be893dd45c99b806338b7261cd698760fdfdad3e7\": container with ID starting with e2f21f434018f81ca084379be893dd45c99b806338b7261cd698760fdfdad3e7 not found: ID does not exist" containerID="e2f21f434018f81ca084379be893dd45c99b806338b7261cd698760fdfdad3e7" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.151668 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e2f21f434018f81ca084379be893dd45c99b806338b7261cd698760fdfdad3e7"} err="failed to get container status \"e2f21f434018f81ca084379be893dd45c99b806338b7261cd698760fdfdad3e7\": rpc error: code = NotFound desc = could not find container \"e2f21f434018f81ca084379be893dd45c99b806338b7261cd698760fdfdad3e7\": container with ID starting with e2f21f434018f81ca084379be893dd45c99b806338b7261cd698760fdfdad3e7 not found: ID does not exist" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.176066 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-api-0"] Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.187025 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-api-0"] Mar 
19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209038 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/cinder-api-0"] Mar 19 10:43:54 crc kubenswrapper[4765]: E0319 10:43:54.209477 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5de4629f-4496-4991-962f-4410df18a713" containerName="mariadb-account-create-update" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209496 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5de4629f-4496-4991-962f-4410df18a713" containerName="mariadb-account-create-update" Mar 19 10:43:54 crc kubenswrapper[4765]: E0319 10:43:54.209512 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f567275e-0c40-4ef2-8c5f-fb40aad223f8" containerName="cinder-api-log" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209519 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f567275e-0c40-4ef2-8c5f-fb40aad223f8" containerName="cinder-api-log" Mar 19 10:43:54 crc kubenswrapper[4765]: E0319 10:43:54.209534 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="63fab428-8477-40cd-bd57-250471e0d108" containerName="mariadb-account-create-update" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209540 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="63fab428-8477-40cd-bd57-250471e0d108" containerName="mariadb-account-create-update" Mar 19 10:43:54 crc kubenswrapper[4765]: E0319 10:43:54.209551 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1cb1943e-d6aa-4223-8654-5e674a71b734" containerName="mariadb-database-create" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209556 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1cb1943e-d6aa-4223-8654-5e674a71b734" containerName="mariadb-database-create" Mar 19 10:43:54 crc kubenswrapper[4765]: E0319 10:43:54.209570 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="97833765-fe7a-40eb-9764-180d2123e113" 
containerName="mariadb-account-create-update" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209576 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="97833765-fe7a-40eb-9764-180d2123e113" containerName="mariadb-account-create-update" Mar 19 10:43:54 crc kubenswrapper[4765]: E0319 10:43:54.209584 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="de70af7a-9885-40d1-868d-14c156308212" containerName="mariadb-database-create" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209591 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="de70af7a-9885-40d1-868d-14c156308212" containerName="mariadb-database-create" Mar 19 10:43:54 crc kubenswrapper[4765]: E0319 10:43:54.209599 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0821b55-a4e1-4b0f-af18-513aefaa8d9e" containerName="mariadb-database-create" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209604 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0821b55-a4e1-4b0f-af18-513aefaa8d9e" containerName="mariadb-database-create" Mar 19 10:43:54 crc kubenswrapper[4765]: E0319 10:43:54.209630 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f567275e-0c40-4ef2-8c5f-fb40aad223f8" containerName="cinder-api" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209636 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f567275e-0c40-4ef2-8c5f-fb40aad223f8" containerName="cinder-api" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209822 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0821b55-a4e1-4b0f-af18-513aefaa8d9e" containerName="mariadb-database-create" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209836 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="63fab428-8477-40cd-bd57-250471e0d108" containerName="mariadb-account-create-update" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209843 4765 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="97833765-fe7a-40eb-9764-180d2123e113" containerName="mariadb-account-create-update" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209852 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f567275e-0c40-4ef2-8c5f-fb40aad223f8" containerName="cinder-api" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209869 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="de70af7a-9885-40d1-868d-14c156308212" containerName="mariadb-database-create" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209879 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1cb1943e-d6aa-4223-8654-5e674a71b734" containerName="mariadb-database-create" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209889 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f567275e-0c40-4ef2-8c5f-fb40aad223f8" containerName="cinder-api-log" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.209896 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5de4629f-4496-4991-962f-4410df18a713" containerName="mariadb-account-create-update" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.211003 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.214038 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cinder-api-config-data" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.214454 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-public-svc" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.214594 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-cinder-internal-svc" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.214820 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.368492 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f567275e-0c40-4ef2-8c5f-fb40aad223f8" path="/var/lib/kubelet/pods/f567275e-0c40-4ef2-8c5f-fb40aad223f8/volumes" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.379403 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-scripts\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.379489 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-config-data\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.379562 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-public-tls-certs\") pod \"cinder-api-0\" (UID: 
\"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.379594 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-config-data-custom\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.379628 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.379650 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-knmxj\" (UniqueName: \"kubernetes.io/projected/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-kube-api-access-knmxj\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.379688 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.379717 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 
crc kubenswrapper[4765]: I0319 10:43:54.379740 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-logs\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.482011 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.482106 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-knmxj\" (UniqueName: \"kubernetes.io/projected/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-kube-api-access-knmxj\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.482218 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.482259 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.482299 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-logs\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.482338 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-scripts\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.482427 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-config-data\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.482528 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.482587 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-config-data-custom\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.484018 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-machine-id\" (UniqueName: \"kubernetes.io/host-path/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-etc-machine-id\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.484308 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-logs\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.490469 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-public-tls-certs\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.490875 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-combined-ca-bundle\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.492757 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-custom\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-config-data-custom\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.492989 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-scripts\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.494551 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-internal-tls-certs\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " 
pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.497065 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-config-data\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.513655 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-knmxj\" (UniqueName: \"kubernetes.io/projected/fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7-kube-api-access-knmxj\") pod \"cinder-api-0\" (UID: \"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7\") " pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.542014 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/cinder-api-0" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.567224 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rvf9r"] Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.568790 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.572652 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-t8r4j" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.572879 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-scripts" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.573085 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.585882 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rvf9r"] Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.686979 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-scripts\") pod \"nova-cell0-conductor-db-sync-rvf9r\" (UID: \"d0411109-7b7f-4013-baef-8970df3e2dbf\") " pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.687065 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z746p\" (UniqueName: \"kubernetes.io/projected/d0411109-7b7f-4013-baef-8970df3e2dbf-kube-api-access-z746p\") pod \"nova-cell0-conductor-db-sync-rvf9r\" (UID: \"d0411109-7b7f-4013-baef-8970df3e2dbf\") " pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.687246 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-config-data\") pod \"nova-cell0-conductor-db-sync-rvf9r\" (UID: \"d0411109-7b7f-4013-baef-8970df3e2dbf\") " 
pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.687369 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rvf9r\" (UID: \"d0411109-7b7f-4013-baef-8970df3e2dbf\") " pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.790882 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-config-data\") pod \"nova-cell0-conductor-db-sync-rvf9r\" (UID: \"d0411109-7b7f-4013-baef-8970df3e2dbf\") " pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.791024 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rvf9r\" (UID: \"d0411109-7b7f-4013-baef-8970df3e2dbf\") " pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.791114 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-scripts\") pod \"nova-cell0-conductor-db-sync-rvf9r\" (UID: \"d0411109-7b7f-4013-baef-8970df3e2dbf\") " pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.791147 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z746p\" (UniqueName: \"kubernetes.io/projected/d0411109-7b7f-4013-baef-8970df3e2dbf-kube-api-access-z746p\") pod \"nova-cell0-conductor-db-sync-rvf9r\" (UID: 
\"d0411109-7b7f-4013-baef-8970df3e2dbf\") " pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.798432 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-combined-ca-bundle\") pod \"nova-cell0-conductor-db-sync-rvf9r\" (UID: \"d0411109-7b7f-4013-baef-8970df3e2dbf\") " pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.802756 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-config-data\") pod \"nova-cell0-conductor-db-sync-rvf9r\" (UID: \"d0411109-7b7f-4013-baef-8970df3e2dbf\") " pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.825945 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z746p\" (UniqueName: \"kubernetes.io/projected/d0411109-7b7f-4013-baef-8970df3e2dbf-kube-api-access-z746p\") pod \"nova-cell0-conductor-db-sync-rvf9r\" (UID: \"d0411109-7b7f-4013-baef-8970df3e2dbf\") " pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.826811 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-scripts\") pod \"nova-cell0-conductor-db-sync-rvf9r\" (UID: \"d0411109-7b7f-4013-baef-8970df3e2dbf\") " pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.893505 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:43:54 crc kubenswrapper[4765]: I0319 10:43:54.920727 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/cinder-api-0"] Mar 19 10:43:55 crc kubenswrapper[4765]: I0319 10:43:55.074497 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5c6bdcb6fb-89kxv" podUID="5112f66b-28fa-4500-b77b-351b8c3d0519" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 19 10:43:55 crc kubenswrapper[4765]: I0319 10:43:55.074915 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:43:55 crc kubenswrapper[4765]: I0319 10:43:55.127297 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7","Type":"ContainerStarted","Data":"68cf84d3a1b7fe5d14c69b9d96c89b102de3152b3bb78585c0d0fb626792346e"} Mar 19 10:43:55 crc kubenswrapper[4765]: I0319 10:43:55.145124 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"200aa20d-4c08-468b-b3e9-624d474124b3","Type":"ContainerStarted","Data":"cf26aa1ad96c564828e882c98b81b4060dc6aab5f9862740505e34eea3a05d80"} Mar 19 10:43:55 crc kubenswrapper[4765]: I0319 10:43:55.421377 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rvf9r"] Mar 19 10:43:56 crc kubenswrapper[4765]: I0319 10:43:56.160459 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7","Type":"ContainerStarted","Data":"56fec548b48365b6ec38c2f9afd1768523bb36e17dc70eb06320caab7841cf77"} Mar 19 10:43:56 crc kubenswrapper[4765]: I0319 10:43:56.166084 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-cell0-conductor-db-sync-rvf9r" event={"ID":"d0411109-7b7f-4013-baef-8970df3e2dbf","Type":"ContainerStarted","Data":"11952f85695122fda5b1876001bdb567afc129c9bfef41bc331ba3acb65f30bf"} Mar 19 10:43:57 crc kubenswrapper[4765]: I0319 10:43:57.179004 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"200aa20d-4c08-468b-b3e9-624d474124b3","Type":"ContainerStarted","Data":"4c14f90d0fe810cc9ae32558a93507aacb75e8f2ea2ce8fd1cd87768eaa3c20c"} Mar 19 10:43:57 crc kubenswrapper[4765]: I0319 10:43:57.179410 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 10:43:57 crc kubenswrapper[4765]: I0319 10:43:57.183001 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/cinder-api-0" event={"ID":"fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7","Type":"ContainerStarted","Data":"bf5b64e8fb507404709206bd69e87bdd46ed4dcf0c8cde8760d70dd7bcb04699"} Mar 19 10:43:57 crc kubenswrapper[4765]: I0319 10:43:57.183665 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/cinder-api-0" Mar 19 10:43:57 crc kubenswrapper[4765]: I0319 10:43:57.210130 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.822658706 podStartE2EDuration="8.210107628s" podCreationTimestamp="2026-03-19 10:43:49 +0000 UTC" firstStartedPulling="2026-03-19 10:43:51.464756448 +0000 UTC m=+1329.813701990" lastFinishedPulling="2026-03-19 10:43:56.85220537 +0000 UTC m=+1335.201150912" observedRunningTime="2026-03-19 10:43:57.202238177 +0000 UTC m=+1335.551183719" watchObservedRunningTime="2026-03-19 10:43:57.210107628 +0000 UTC m=+1335.559053170" Mar 19 10:43:57 crc kubenswrapper[4765]: I0319 10:43:57.242686 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/cinder-api-0" podStartSLOduration=3.24266473 podStartE2EDuration="3.24266473s" podCreationTimestamp="2026-03-19 
10:43:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:43:57.233603807 +0000 UTC m=+1335.582549369" watchObservedRunningTime="2026-03-19 10:43:57.24266473 +0000 UTC m=+1335.591610272" Mar 19 10:43:57 crc kubenswrapper[4765]: I0319 10:43:57.432826 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:57 crc kubenswrapper[4765]: I0319 10:43:57.434107 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/swift-proxy-556979b4dc-zj26d" Mar 19 10:43:59 crc kubenswrapper[4765]: I0319 10:43:59.281018 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 10:43:59 crc kubenswrapper[4765]: I0319 10:43:59.281388 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-internal-api-0" Mar 19 10:43:59 crc kubenswrapper[4765]: I0319 10:43:59.323516 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 10:43:59 crc kubenswrapper[4765]: I0319 10:43:59.338744 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-internal-api-0" Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.135609 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565284-5vl4n"] Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.137060 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565284-5vl4n" Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.143872 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.143931 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.148452 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.149513 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565284-5vl4n"] Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.216370 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.216540 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-internal-api-0" Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.237794 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2922\" (UniqueName: \"kubernetes.io/projected/183750a8-660e-41fd-85f8-6deb34af3c2c-kube-api-access-r2922\") pod \"auto-csr-approver-29565284-5vl4n\" (UID: \"183750a8-660e-41fd-85f8-6deb34af3c2c\") " pod="openshift-infra/auto-csr-approver-29565284-5vl4n" Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.250802 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5de4629f_4496_4991_962f_4410df18a713.slice/crio-82b13eed0efad2fb33c9f2bed7ee5a8346a5b7725e51e81fb294faa5038b6c4f": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5de4629f_4496_4991_962f_4410df18a713.slice/crio-82b13eed0efad2fb33c9f2bed7ee5a8346a5b7725e51e81fb294faa5038b6c4f: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.251104 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0821b55_a4e1_4b0f_af18_513aefaa8d9e.slice/crio-dd535de9d8fb22d4e9eba0aa40bed36dea42af2aa79868c5814aa1152160d4a2": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0821b55_a4e1_4b0f_af18_513aefaa8d9e.slice/crio-dd535de9d8fb22d4e9eba0aa40bed36dea42af2aa79868c5814aa1152160d4a2: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.251232 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cb1943e_d6aa_4223_8654_5e674a71b734.slice/crio-f180e3b23a424b3b30041d49adb8d47535a437dda1c32556b9e3d83a8bf9b4be": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cb1943e_d6aa_4223_8654_5e674a71b734.slice/crio-f180e3b23a424b3b30041d49adb8d47535a437dda1c32556b9e3d83a8bf9b4be: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.251351 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63fab428_8477_40cd_bd57_250471e0d108.slice/crio-fc02afca5fbceeaa6de64fa18e457603b31e91e0b926f1503b6179ee7151fda8": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63fab428_8477_40cd_bd57_250471e0d108.slice/crio-fc02afca5fbceeaa6de64fa18e457603b31e91e0b926f1503b6179ee7151fda8: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: 
W0319 10:44:00.251740 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde70af7a_9885_40d1_868d_14c156308212.slice/crio-da8df0bfadb1aa041d4809c5ec2518f3b3ff5d206f63f3af811baf09d98880b1": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde70af7a_9885_40d1_868d_14c156308212.slice/crio-da8df0bfadb1aa041d4809c5ec2518f3b3ff5d206f63f3af811baf09d98880b1: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.251863 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97833765_fe7a_40eb_9764_180d2123e113.slice/crio-6c7ead3ff5d5cc72a973108d128176d9061d792c35f53af0ad5f0852ee0873f7": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97833765_fe7a_40eb_9764_180d2123e113.slice/crio-6c7ead3ff5d5cc72a973108d128176d9061d792c35f53af0ad5f0852ee0873f7: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.252090 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5de4629f_4496_4991_962f_4410df18a713.slice/crio-conmon-b7e33e2b91b09b641878bcc075c4d7b5fb8eb68bd0db94f878cf3b9382d50b4b.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5de4629f_4496_4991_962f_4410df18a713.slice/crio-conmon-b7e33e2b91b09b641878bcc075c4d7b5fb8eb68bd0db94f878cf3b9382d50b4b.scope: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.252178 4765 watcher.go:93] Error while processing event 
("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5de4629f_4496_4991_962f_4410df18a713.slice/crio-b7e33e2b91b09b641878bcc075c4d7b5fb8eb68bd0db94f878cf3b9382d50b4b.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5de4629f_4496_4991_962f_4410df18a713.slice/crio-b7e33e2b91b09b641878bcc075c4d7b5fb8eb68bd0db94f878cf3b9382d50b4b.scope: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.252265 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0821b55_a4e1_4b0f_af18_513aefaa8d9e.slice/crio-conmon-1c943e9ec8e4b88b50621f0140f2ff452c402bcd733d403b06e877382ef1c5e2.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0821b55_a4e1_4b0f_af18_513aefaa8d9e.slice/crio-conmon-1c943e9ec8e4b88b50621f0140f2ff452c402bcd733d403b06e877382ef1c5e2.scope: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.252351 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0821b55_a4e1_4b0f_af18_513aefaa8d9e.slice/crio-1c943e9ec8e4b88b50621f0140f2ff452c402bcd733d403b06e877382ef1c5e2.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0821b55_a4e1_4b0f_af18_513aefaa8d9e.slice/crio-1c943e9ec8e4b88b50621f0140f2ff452c402bcd733d403b06e877382ef1c5e2.scope: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.252424 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde70af7a_9885_40d1_868d_14c156308212.slice/crio-conmon-0d09b81495bf5da85632e84b0aaea7be01f3b5e425bd8111c3eaf7ead482fd6f.scope": 
0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde70af7a_9885_40d1_868d_14c156308212.slice/crio-conmon-0d09b81495bf5da85632e84b0aaea7be01f3b5e425bd8111c3eaf7ead482fd6f.scope: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.252535 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cb1943e_d6aa_4223_8654_5e674a71b734.slice/crio-conmon-213393381c9495cf24e47a93ec6c2fbe35ee4f3848e3c01fedb9b59b7b70cdb0.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cb1943e_d6aa_4223_8654_5e674a71b734.slice/crio-conmon-213393381c9495cf24e47a93ec6c2fbe35ee4f3848e3c01fedb9b59b7b70cdb0.scope: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.252626 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63fab428_8477_40cd_bd57_250471e0d108.slice/crio-conmon-e196bf8e817931b65767a3792002f17daefaf7f72feb2155ee556f929c4fce0b.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63fab428_8477_40cd_bd57_250471e0d108.slice/crio-conmon-e196bf8e817931b65767a3792002f17daefaf7f72feb2155ee556f929c4fce0b.scope: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.252709 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97833765_fe7a_40eb_9764_180d2123e113.slice/crio-conmon-edf040c33203f5d50ab2e2972078f229e81099369bb6ab8599a8caa00aec6236.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch 
/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97833765_fe7a_40eb_9764_180d2123e113.slice/crio-conmon-edf040c33203f5d50ab2e2972078f229e81099369bb6ab8599a8caa00aec6236.scope: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.252777 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde70af7a_9885_40d1_868d_14c156308212.slice/crio-0d09b81495bf5da85632e84b0aaea7be01f3b5e425bd8111c3eaf7ead482fd6f.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde70af7a_9885_40d1_868d_14c156308212.slice/crio-0d09b81495bf5da85632e84b0aaea7be01f3b5e425bd8111c3eaf7ead482fd6f.scope: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.253026 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63fab428_8477_40cd_bd57_250471e0d108.slice/crio-e196bf8e817931b65767a3792002f17daefaf7f72feb2155ee556f929c4fce0b.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63fab428_8477_40cd_bd57_250471e0d108.slice/crio-e196bf8e817931b65767a3792002f17daefaf7f72feb2155ee556f929c4fce0b.scope: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.253200 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cb1943e_d6aa_4223_8654_5e674a71b734.slice/crio-213393381c9495cf24e47a93ec6c2fbe35ee4f3848e3c01fedb9b59b7b70cdb0.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cb1943e_d6aa_4223_8654_5e674a71b734.slice/crio-213393381c9495cf24e47a93ec6c2fbe35ee4f3848e3c01fedb9b59b7b70cdb0.scope: no such file or 
directory Mar 19 10:44:00 crc kubenswrapper[4765]: W0319 10:44:00.253282 4765 watcher.go:93] Error while processing event ("/sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97833765_fe7a_40eb_9764_180d2123e113.slice/crio-edf040c33203f5d50ab2e2972078f229e81099369bb6ab8599a8caa00aec6236.scope": 0x40000100 == IN_CREATE|IN_ISDIR): inotify_add_watch /sys/fs/cgroup/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97833765_fe7a_40eb_9764_180d2123e113.slice/crio-edf040c33203f5d50ab2e2972078f229e81099369bb6ab8599a8caa00aec6236.scope: no such file or directory Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.340219 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r2922\" (UniqueName: \"kubernetes.io/projected/183750a8-660e-41fd-85f8-6deb34af3c2c-kube-api-access-r2922\") pod \"auto-csr-approver-29565284-5vl4n\" (UID: \"183750a8-660e-41fd-85f8-6deb34af3c2c\") " pod="openshift-infra/auto-csr-approver-29565284-5vl4n" Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.380813 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2922\" (UniqueName: \"kubernetes.io/projected/183750a8-660e-41fd-85f8-6deb34af3c2c-kube-api-access-r2922\") pod \"auto-csr-approver-29565284-5vl4n\" (UID: \"183750a8-660e-41fd-85f8-6deb34af3c2c\") " pod="openshift-infra/auto-csr-approver-29565284-5vl4n" Mar 19 10:44:00 crc kubenswrapper[4765]: E0319 10:44:00.448068 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1cb1943e_d6aa_4223_8654_5e674a71b734.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bfdc94f_3c5d_47de_8d4a_59d804b9b68e.slice\": RecentStats: unable to find data in memory cache], 
[\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab7915d2_c641_481f_a9f6_1ce1209c7e17.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd0821b55_a4e1_4b0f_af18_513aefaa8d9e.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3bfdc94f_3c5d_47de_8d4a_59d804b9b68e.slice/crio-cb325346cdc1417ef37ac90ce95f3a5ab8088a09a05964b5b8c0269c2c01a398\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf567275e_0c40_4ef2_8c5f_fb40aad223f8.slice/crio-conmon-1a32fdc48ba6fef3a705f3889d8d3b36ffec2096785e8d52ad921d64bba59bdf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dd71512_2453_4dff_98d8_3cf981fbbb8f.slice/crio-3f490d1e63a9e9ef60bbe99ababdfcaa0de7a8f4eb2e6bc867039f2492ba6c2e\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf567275e_0c40_4ef2_8c5f_fb40aad223f8.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8dd71512_2453_4dff_98d8_3cf981fbbb8f.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podde70af7a_9885_40d1_868d_14c156308212.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf567275e_0c40_4ef2_8c5f_fb40aad223f8.slice/crio-1a32fdc48ba6fef3a705f3889d8d3b36ffec2096785e8d52ad921d64bba59bdf.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5112f66b_28fa_4500_b77b_351b8c3d0519.slice/crio-conmon-9f412968986b7556b9d0cd9de4886ebe503ad2f5c2b7c5677168459667cc0902.scope\": RecentStats: 
unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5de4629f_4496_4991_962f_4410df18a713.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod63fab428_8477_40cd_bd57_250471e0d108.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab7915d2_c641_481f_a9f6_1ce1209c7e17.slice/crio-d9a30db13c825fc0be017e8fb75f92508819e713cf7d8ca09ec58af79d3b329b\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf567275e_0c40_4ef2_8c5f_fb40aad223f8.slice/crio-23b661699f9206411e682dfbf2c1e896ea6afe4e7c932d35aa1382d0bcce9704\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod97833765_fe7a_40eb_9764_180d2123e113.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5112f66b_28fa_4500_b77b_351b8c3d0519.slice/crio-9f412968986b7556b9d0cd9de4886ebe503ad2f5c2b7c5677168459667cc0902.scope\": RecentStats: unable to find data in memory cache]" Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.464421 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565284-5vl4n" Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.523185 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.523232 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/glance-default-external-api-0" Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.557342 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 10:44:00 crc kubenswrapper[4765]: I0319 10:44:00.568013 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/glance-default-external-api-0" Mar 19 10:44:01 crc kubenswrapper[4765]: I0319 10:44:01.228270 4765 generic.go:334] "Generic (PLEG): container finished" podID="5112f66b-28fa-4500-b77b-351b8c3d0519" containerID="9f412968986b7556b9d0cd9de4886ebe503ad2f5c2b7c5677168459667cc0902" exitCode=137 Mar 19 10:44:01 crc kubenswrapper[4765]: I0319 10:44:01.228346 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c6bdcb6fb-89kxv" event={"ID":"5112f66b-28fa-4500-b77b-351b8c3d0519","Type":"ContainerDied","Data":"9f412968986b7556b9d0cd9de4886ebe503ad2f5c2b7c5677168459667cc0902"} Mar 19 10:44:01 crc kubenswrapper[4765]: I0319 10:44:01.229638 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 10:44:01 crc kubenswrapper[4765]: I0319 10:44:01.229671 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/glance-default-external-api-0" Mar 19 10:44:02 crc kubenswrapper[4765]: I0319 10:44:02.242504 4765 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 19 10:44:02 crc kubenswrapper[4765]: I0319 10:44:02.242538 4765 prober_manager.go:312] "Failed to trigger a manual 
run" probe="Readiness" Mar 19 10:44:02 crc kubenswrapper[4765]: I0319 10:44:02.700938 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 10:44:02 crc kubenswrapper[4765]: I0319 10:44:02.711562 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-internal-api-0" Mar 19 10:44:02 crc kubenswrapper[4765]: I0319 10:44:02.891045 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:44:02 crc kubenswrapper[4765]: I0319 10:44:02.891565 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" containerName="ceilometer-central-agent" containerID="cri-o://ff9f488b9e6672417a0584c6eb408c902ff72e1bae985433151cc691d7fb4abe" gracePeriod=30 Mar 19 10:44:02 crc kubenswrapper[4765]: I0319 10:44:02.892068 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" containerName="proxy-httpd" containerID="cri-o://4c14f90d0fe810cc9ae32558a93507aacb75e8f2ea2ce8fd1cd87768eaa3c20c" gracePeriod=30 Mar 19 10:44:02 crc kubenswrapper[4765]: I0319 10:44:02.892194 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" containerName="sg-core" containerID="cri-o://cf26aa1ad96c564828e882c98b81b4060dc6aab5f9862740505e34eea3a05d80" gracePeriod=30 Mar 19 10:44:02 crc kubenswrapper[4765]: I0319 10:44:02.892452 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" containerName="ceilometer-notification-agent" containerID="cri-o://97fa4bb2fc6965385bfc207947dd3dcb057fd521b66ab9225e2953eed87ef104" gracePeriod=30 Mar 19 10:44:03 crc kubenswrapper[4765]: I0319 10:44:03.261125 
4765 generic.go:334] "Generic (PLEG): container finished" podID="200aa20d-4c08-468b-b3e9-624d474124b3" containerID="4c14f90d0fe810cc9ae32558a93507aacb75e8f2ea2ce8fd1cd87768eaa3c20c" exitCode=0 Mar 19 10:44:03 crc kubenswrapper[4765]: I0319 10:44:03.261162 4765 generic.go:334] "Generic (PLEG): container finished" podID="200aa20d-4c08-468b-b3e9-624d474124b3" containerID="cf26aa1ad96c564828e882c98b81b4060dc6aab5f9862740505e34eea3a05d80" exitCode=2 Mar 19 10:44:03 crc kubenswrapper[4765]: I0319 10:44:03.261170 4765 generic.go:334] "Generic (PLEG): container finished" podID="200aa20d-4c08-468b-b3e9-624d474124b3" containerID="97fa4bb2fc6965385bfc207947dd3dcb057fd521b66ab9225e2953eed87ef104" exitCode=0 Mar 19 10:44:03 crc kubenswrapper[4765]: I0319 10:44:03.261806 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"200aa20d-4c08-468b-b3e9-624d474124b3","Type":"ContainerDied","Data":"4c14f90d0fe810cc9ae32558a93507aacb75e8f2ea2ce8fd1cd87768eaa3c20c"} Mar 19 10:44:03 crc kubenswrapper[4765]: I0319 10:44:03.261838 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"200aa20d-4c08-468b-b3e9-624d474124b3","Type":"ContainerDied","Data":"cf26aa1ad96c564828e882c98b81b4060dc6aab5f9862740505e34eea3a05d80"} Mar 19 10:44:03 crc kubenswrapper[4765]: I0319 10:44:03.261849 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"200aa20d-4c08-468b-b3e9-624d474124b3","Type":"ContainerDied","Data":"97fa4bb2fc6965385bfc207947dd3dcb057fd521b66ab9225e2953eed87ef104"} Mar 19 10:44:03 crc kubenswrapper[4765]: I0319 10:44:03.577382 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 10:44:03 crc kubenswrapper[4765]: I0319 10:44:03.577577 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/glance-default-external-api-0" Mar 19 10:44:04 crc kubenswrapper[4765]: 
I0319 10:44:04.276932 4765 generic.go:334] "Generic (PLEG): container finished" podID="200aa20d-4c08-468b-b3e9-624d474124b3" containerID="ff9f488b9e6672417a0584c6eb408c902ff72e1bae985433151cc691d7fb4abe" exitCode=0 Mar 19 10:44:04 crc kubenswrapper[4765]: I0319 10:44:04.278211 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"200aa20d-4c08-468b-b3e9-624d474124b3","Type":"ContainerDied","Data":"ff9f488b9e6672417a0584c6eb408c902ff72e1bae985433151cc691d7fb4abe"} Mar 19 10:44:05 crc kubenswrapper[4765]: I0319 10:44:05.072660 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/horizon-5c6bdcb6fb-89kxv" podUID="5112f66b-28fa-4500-b77b-351b8c3d0519" containerName="horizon" probeResult="failure" output="Get \"https://10.217.0.152:8443/dashboard/auth/login/?next=/dashboard/\": dial tcp 10.217.0.152:8443: connect: connection refused" Mar 19 10:44:06 crc kubenswrapper[4765]: I0319 10:44:06.730088 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/cinder-api-0" Mar 19 10:44:08 crc kubenswrapper[4765]: E0319 10:44:08.240633 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified" Mar 19 10:44:08 crc kubenswrapper[4765]: E0319 10:44:08.241118 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:nova-cell0-conductor-db-sync,Image:quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified,Command:[/bin/bash],Args:[-c 
/usr/local/bin/kolla_start],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CELL_NAME,Value:cell0,ValueFrom:nil,},EnvVar{Name:KOLLA_BOOTSTRAP,Value:true,ValueFrom:nil,},EnvVar{Name:KOLLA_CONFIG_STRATEGY,Value:COPY_ALWAYS,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/openstack/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:scripts,ReadOnly:false,MountPath:/var/lib/openstack/bin,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:config-data,ReadOnly:false,MountPath:/var/lib/kolla/config_files/config.json,SubPath:nova-conductor-dbsync-config.json,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:combined-ca-bundle,ReadOnly:true,MountPath:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z746p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42436,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:nil,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-cell0-conductor-db-sync-rvf9r_openstack(d0411109-7b7f-4013-baef-8970df3e2dbf): ErrImagePull: rpc error: code = 
Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 10:44:08 crc kubenswrapper[4765]: E0319 10:44:08.249331 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/nova-cell0-conductor-db-sync-rvf9r" podUID="d0411109-7b7f-4013-baef-8970df3e2dbf" Mar 19 10:44:08 crc kubenswrapper[4765]: E0319 10:44:08.329125 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"nova-cell0-conductor-db-sync\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-nova-conductor:current-podified\\\"\"" pod="openstack/nova-cell0-conductor-db-sync-rvf9r" podUID="d0411109-7b7f-4013-baef-8970df3e2dbf" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.598863 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.769780 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-horizon-tls-certs\") pod \"5112f66b-28fa-4500-b77b-351b8c3d0519\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.769899 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-combined-ca-bundle\") pod \"5112f66b-28fa-4500-b77b-351b8c3d0519\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.770050 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/configmap/5112f66b-28fa-4500-b77b-351b8c3d0519-scripts\") pod \"5112f66b-28fa-4500-b77b-351b8c3d0519\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.770178 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5112f66b-28fa-4500-b77b-351b8c3d0519-config-data\") pod \"5112f66b-28fa-4500-b77b-351b8c3d0519\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.770202 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-horizon-secret-key\") pod \"5112f66b-28fa-4500-b77b-351b8c3d0519\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.770260 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hphkq\" (UniqueName: \"kubernetes.io/projected/5112f66b-28fa-4500-b77b-351b8c3d0519-kube-api-access-hphkq\") pod \"5112f66b-28fa-4500-b77b-351b8c3d0519\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.770312 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5112f66b-28fa-4500-b77b-351b8c3d0519-logs\") pod \"5112f66b-28fa-4500-b77b-351b8c3d0519\" (UID: \"5112f66b-28fa-4500-b77b-351b8c3d0519\") " Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.771117 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5112f66b-28fa-4500-b77b-351b8c3d0519-logs" (OuterVolumeSpecName: "logs") pod "5112f66b-28fa-4500-b77b-351b8c3d0519" (UID: "5112f66b-28fa-4500-b77b-351b8c3d0519"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.776279 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5112f66b-28fa-4500-b77b-351b8c3d0519-kube-api-access-hphkq" (OuterVolumeSpecName: "kube-api-access-hphkq") pod "5112f66b-28fa-4500-b77b-351b8c3d0519" (UID: "5112f66b-28fa-4500-b77b-351b8c3d0519"). InnerVolumeSpecName "kube-api-access-hphkq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.777471 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-horizon-secret-key" (OuterVolumeSpecName: "horizon-secret-key") pod "5112f66b-28fa-4500-b77b-351b8c3d0519" (UID: "5112f66b-28fa-4500-b77b-351b8c3d0519"). InnerVolumeSpecName "horizon-secret-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.787641 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.802991 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5112f66b-28fa-4500-b77b-351b8c3d0519-config-data" (OuterVolumeSpecName: "config-data") pod "5112f66b-28fa-4500-b77b-351b8c3d0519" (UID: "5112f66b-28fa-4500-b77b-351b8c3d0519"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.809796 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5112f66b-28fa-4500-b77b-351b8c3d0519-scripts" (OuterVolumeSpecName: "scripts") pod "5112f66b-28fa-4500-b77b-351b8c3d0519" (UID: "5112f66b-28fa-4500-b77b-351b8c3d0519"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.848646 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-horizon-tls-certs" (OuterVolumeSpecName: "horizon-tls-certs") pod "5112f66b-28fa-4500-b77b-351b8c3d0519" (UID: "5112f66b-28fa-4500-b77b-351b8c3d0519"). InnerVolumeSpecName "horizon-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.854254 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "5112f66b-28fa-4500-b77b-351b8c3d0519" (UID: "5112f66b-28fa-4500-b77b-351b8c3d0519"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.872995 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/5112f66b-28fa-4500-b77b-351b8c3d0519-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.873044 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/5112f66b-28fa-4500-b77b-351b8c3d0519-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.873057 4765 reconciler_common.go:293] "Volume detached for volume \"horizon-secret-key\" (UniqueName: \"kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-horizon-secret-key\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.873075 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hphkq\" (UniqueName: \"kubernetes.io/projected/5112f66b-28fa-4500-b77b-351b8c3d0519-kube-api-access-hphkq\") on node 
\"crc\" DevicePath \"\"" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.873089 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5112f66b-28fa-4500-b77b-351b8c3d0519-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.873097 4765 reconciler_common.go:293] "Volume detached for volume \"horizon-tls-certs\" (UniqueName: \"kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-horizon-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.873105 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5112f66b-28fa-4500-b77b-351b8c3d0519-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.929836 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565284-5vl4n"] Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.974018 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/200aa20d-4c08-468b-b3e9-624d474124b3-run-httpd\") pod \"200aa20d-4c08-468b-b3e9-624d474124b3\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.974377 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-sg-core-conf-yaml\") pod \"200aa20d-4c08-468b-b3e9-624d474124b3\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.974424 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-combined-ca-bundle\") pod 
\"200aa20d-4c08-468b-b3e9-624d474124b3\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.974479 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/200aa20d-4c08-468b-b3e9-624d474124b3-log-httpd\") pod \"200aa20d-4c08-468b-b3e9-624d474124b3\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.974479 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/200aa20d-4c08-468b-b3e9-624d474124b3-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "200aa20d-4c08-468b-b3e9-624d474124b3" (UID: "200aa20d-4c08-468b-b3e9-624d474124b3"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.974539 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4l72\" (UniqueName: \"kubernetes.io/projected/200aa20d-4c08-468b-b3e9-624d474124b3-kube-api-access-r4l72\") pod \"200aa20d-4c08-468b-b3e9-624d474124b3\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.974621 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-config-data\") pod \"200aa20d-4c08-468b-b3e9-624d474124b3\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.974642 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-scripts\") pod \"200aa20d-4c08-468b-b3e9-624d474124b3\" (UID: \"200aa20d-4c08-468b-b3e9-624d474124b3\") " Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.975097 4765 
reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/200aa20d-4c08-468b-b3e9-624d474124b3-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.975206 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/200aa20d-4c08-468b-b3e9-624d474124b3-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "200aa20d-4c08-468b-b3e9-624d474124b3" (UID: "200aa20d-4c08-468b-b3e9-624d474124b3"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.981218 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-scripts" (OuterVolumeSpecName: "scripts") pod "200aa20d-4c08-468b-b3e9-624d474124b3" (UID: "200aa20d-4c08-468b-b3e9-624d474124b3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:08 crc kubenswrapper[4765]: I0319 10:44:08.981260 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/200aa20d-4c08-468b-b3e9-624d474124b3-kube-api-access-r4l72" (OuterVolumeSpecName: "kube-api-access-r4l72") pod "200aa20d-4c08-468b-b3e9-624d474124b3" (UID: "200aa20d-4c08-468b-b3e9-624d474124b3"). InnerVolumeSpecName "kube-api-access-r4l72". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.000818 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "200aa20d-4c08-468b-b3e9-624d474124b3" (UID: "200aa20d-4c08-468b-b3e9-624d474124b3"). InnerVolumeSpecName "sg-core-conf-yaml". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.042051 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "200aa20d-4c08-468b-b3e9-624d474124b3" (UID: "200aa20d-4c08-468b-b3e9-624d474124b3"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.066909 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-config-data" (OuterVolumeSpecName: "config-data") pod "200aa20d-4c08-468b-b3e9-624d474124b3" (UID: "200aa20d-4c08-468b-b3e9-624d474124b3"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.077120 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.077160 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.077172 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.077187 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/200aa20d-4c08-468b-b3e9-624d474124b3-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:09 
crc kubenswrapper[4765]: I0319 10:44:09.077198 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/200aa20d-4c08-468b-b3e9-624d474124b3-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.077209 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4l72\" (UniqueName: \"kubernetes.io/projected/200aa20d-4c08-468b-b3e9-624d474124b3-kube-api-access-r4l72\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.335639 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565284-5vl4n" event={"ID":"183750a8-660e-41fd-85f8-6deb34af3c2c","Type":"ContainerStarted","Data":"dc6402d319530b9a9a7ff815c27d6f5ca35cb7a1c142ceeab9c264a9c9e6d67e"} Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.339589 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/horizon-5c6bdcb6fb-89kxv" event={"ID":"5112f66b-28fa-4500-b77b-351b8c3d0519","Type":"ContainerDied","Data":"1dab967f1829d7055dc9c1ebc1bbad7a17f5e9bc1937b2222904e094518abd77"} Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.339680 4765 scope.go:117] "RemoveContainer" containerID="1a7ad5eca76b21850fa11fd220a31f1fb2463a805be4f0068170a81bfea7d086" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.339610 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/horizon-5c6bdcb6fb-89kxv" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.342562 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"200aa20d-4c08-468b-b3e9-624d474124b3","Type":"ContainerDied","Data":"7cd22f243003b4007e330747994313a4db66277b89a504c50e4e2f72d26f0951"} Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.342625 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.388441 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/horizon-5c6bdcb6fb-89kxv"] Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.411610 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/horizon-5c6bdcb6fb-89kxv"] Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.422017 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.429528 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.465038 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:44:09 crc kubenswrapper[4765]: E0319 10:44:09.466172 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" containerName="sg-core" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.466192 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" containerName="sg-core" Mar 19 10:44:09 crc kubenswrapper[4765]: E0319 10:44:09.466225 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" containerName="ceilometer-notification-agent" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.466234 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" containerName="ceilometer-notification-agent" Mar 19 10:44:09 crc kubenswrapper[4765]: E0319 10:44:09.466267 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5112f66b-28fa-4500-b77b-351b8c3d0519" containerName="horizon-log" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.466277 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5112f66b-28fa-4500-b77b-351b8c3d0519" 
containerName="horizon-log" Mar 19 10:44:09 crc kubenswrapper[4765]: E0319 10:44:09.466301 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5112f66b-28fa-4500-b77b-351b8c3d0519" containerName="horizon" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.466309 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5112f66b-28fa-4500-b77b-351b8c3d0519" containerName="horizon" Mar 19 10:44:09 crc kubenswrapper[4765]: E0319 10:44:09.466324 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" containerName="proxy-httpd" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.466331 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" containerName="proxy-httpd" Mar 19 10:44:09 crc kubenswrapper[4765]: E0319 10:44:09.466353 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" containerName="ceilometer-central-agent" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.466360 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" containerName="ceilometer-central-agent" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.466858 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" containerName="sg-core" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.466910 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" containerName="ceilometer-notification-agent" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.466932 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5112f66b-28fa-4500-b77b-351b8c3d0519" containerName="horizon" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.466955 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" 
containerName="ceilometer-central-agent" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.467004 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" containerName="proxy-httpd" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.467029 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5112f66b-28fa-4500-b77b-351b8c3d0519" containerName="horizon-log" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.476212 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.481942 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.489217 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.496454 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.563866 4765 scope.go:117] "RemoveContainer" containerID="9f412968986b7556b9d0cd9de4886ebe503ad2f5c2b7c5677168459667cc0902" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.585306 4765 scope.go:117] "RemoveContainer" containerID="4c14f90d0fe810cc9ae32558a93507aacb75e8f2ea2ce8fd1cd87768eaa3c20c" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.587392 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdadca91-8f46-407e-ab76-30cb026fe08a-log-httpd\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.587437 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/bdadca91-8f46-407e-ab76-30cb026fe08a-run-httpd\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.587500 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-csgrs\" (UniqueName: \"kubernetes.io/projected/bdadca91-8f46-407e-ab76-30cb026fe08a-kube-api-access-csgrs\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.587634 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-scripts\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.587924 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-config-data\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.588058 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.588099 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: 
\"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.603019 4765 scope.go:117] "RemoveContainer" containerID="cf26aa1ad96c564828e882c98b81b4060dc6aab5f9862740505e34eea3a05d80" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.630027 4765 scope.go:117] "RemoveContainer" containerID="97fa4bb2fc6965385bfc207947dd3dcb057fd521b66ab9225e2953eed87ef104" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.654784 4765 scope.go:117] "RemoveContainer" containerID="ff9f488b9e6672417a0584c6eb408c902ff72e1bae985433151cc691d7fb4abe" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.690369 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-config-data\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.690442 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.690469 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.690507 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdadca91-8f46-407e-ab76-30cb026fe08a-log-httpd\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " 
pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.690528 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdadca91-8f46-407e-ab76-30cb026fe08a-run-httpd\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.690568 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-csgrs\" (UniqueName: \"kubernetes.io/projected/bdadca91-8f46-407e-ab76-30cb026fe08a-kube-api-access-csgrs\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.690600 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-scripts\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.691001 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdadca91-8f46-407e-ab76-30cb026fe08a-run-httpd\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.691082 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdadca91-8f46-407e-ab76-30cb026fe08a-log-httpd\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.694939 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-scripts\") 
pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.695742 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.696145 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.697040 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-config-data\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.723712 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-csgrs\" (UniqueName: \"kubernetes.io/projected/bdadca91-8f46-407e-ab76-30cb026fe08a-kube-api-access-csgrs\") pod \"ceilometer-0\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " pod="openstack/ceilometer-0" Mar 19 10:44:09 crc kubenswrapper[4765]: I0319 10:44:09.803188 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:44:10 crc kubenswrapper[4765]: I0319 10:44:10.281569 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:44:10 crc kubenswrapper[4765]: W0319 10:44:10.291778 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbdadca91_8f46_407e_ab76_30cb026fe08a.slice/crio-e867426dafb049825702f8b101bdaf8d948ad2b8c5f4e355717d09e71ddf3176 WatchSource:0}: Error finding container e867426dafb049825702f8b101bdaf8d948ad2b8c5f4e355717d09e71ddf3176: Status 404 returned error can't find the container with id e867426dafb049825702f8b101bdaf8d948ad2b8c5f4e355717d09e71ddf3176 Mar 19 10:44:10 crc kubenswrapper[4765]: I0319 10:44:10.354458 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565284-5vl4n" event={"ID":"183750a8-660e-41fd-85f8-6deb34af3c2c","Type":"ContainerStarted","Data":"c0ef87341cad1856dc8776dcbfd142ffab12f6f170e88b3861bb452ff00eda65"} Mar 19 10:44:10 crc kubenswrapper[4765]: I0319 10:44:10.365053 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="200aa20d-4c08-468b-b3e9-624d474124b3" path="/var/lib/kubelet/pods/200aa20d-4c08-468b-b3e9-624d474124b3/volumes" Mar 19 10:44:10 crc kubenswrapper[4765]: I0319 10:44:10.366234 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5112f66b-28fa-4500-b77b-351b8c3d0519" path="/var/lib/kubelet/pods/5112f66b-28fa-4500-b77b-351b8c3d0519/volumes" Mar 19 10:44:10 crc kubenswrapper[4765]: I0319 10:44:10.366927 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdadca91-8f46-407e-ab76-30cb026fe08a","Type":"ContainerStarted","Data":"e867426dafb049825702f8b101bdaf8d948ad2b8c5f4e355717d09e71ddf3176"} Mar 19 10:44:10 crc kubenswrapper[4765]: I0319 10:44:10.373264 4765 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-infra/auto-csr-approver-29565284-5vl4n" podStartSLOduration=9.368331299 podStartE2EDuration="10.373245245s" podCreationTimestamp="2026-03-19 10:44:00 +0000 UTC" firstStartedPulling="2026-03-19 10:44:08.937322251 +0000 UTC m=+1347.286267793" lastFinishedPulling="2026-03-19 10:44:09.942236197 +0000 UTC m=+1348.291181739" observedRunningTime="2026-03-19 10:44:10.367524805 +0000 UTC m=+1348.716470347" watchObservedRunningTime="2026-03-19 10:44:10.373245245 +0000 UTC m=+1348.722190787" Mar 19 10:44:11 crc kubenswrapper[4765]: I0319 10:44:11.166141 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:44:11 crc kubenswrapper[4765]: I0319 10:44:11.369984 4765 generic.go:334] "Generic (PLEG): container finished" podID="183750a8-660e-41fd-85f8-6deb34af3c2c" containerID="c0ef87341cad1856dc8776dcbfd142ffab12f6f170e88b3861bb452ff00eda65" exitCode=0 Mar 19 10:44:11 crc kubenswrapper[4765]: I0319 10:44:11.371140 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565284-5vl4n" event={"ID":"183750a8-660e-41fd-85f8-6deb34af3c2c","Type":"ContainerDied","Data":"c0ef87341cad1856dc8776dcbfd142ffab12f6f170e88b3861bb452ff00eda65"} Mar 19 10:44:12 crc kubenswrapper[4765]: I0319 10:44:12.381576 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdadca91-8f46-407e-ab76-30cb026fe08a","Type":"ContainerStarted","Data":"2fb337270c737852a0ee68b737b98aee76ab8f4410dd1b93e8bf1cd484dae6eb"} Mar 19 10:44:12 crc kubenswrapper[4765]: I0319 10:44:12.743080 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565284-5vl4n" Mar 19 10:44:12 crc kubenswrapper[4765]: I0319 10:44:12.847357 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2922\" (UniqueName: \"kubernetes.io/projected/183750a8-660e-41fd-85f8-6deb34af3c2c-kube-api-access-r2922\") pod \"183750a8-660e-41fd-85f8-6deb34af3c2c\" (UID: \"183750a8-660e-41fd-85f8-6deb34af3c2c\") " Mar 19 10:44:12 crc kubenswrapper[4765]: I0319 10:44:12.873160 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/183750a8-660e-41fd-85f8-6deb34af3c2c-kube-api-access-r2922" (OuterVolumeSpecName: "kube-api-access-r2922") pod "183750a8-660e-41fd-85f8-6deb34af3c2c" (UID: "183750a8-660e-41fd-85f8-6deb34af3c2c"). InnerVolumeSpecName "kube-api-access-r2922". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:44:12 crc kubenswrapper[4765]: I0319 10:44:12.950288 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r2922\" (UniqueName: \"kubernetes.io/projected/183750a8-660e-41fd-85f8-6deb34af3c2c-kube-api-access-r2922\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:13 crc kubenswrapper[4765]: I0319 10:44:13.393724 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565284-5vl4n" event={"ID":"183750a8-660e-41fd-85f8-6deb34af3c2c","Type":"ContainerDied","Data":"dc6402d319530b9a9a7ff815c27d6f5ca35cb7a1c142ceeab9c264a9c9e6d67e"} Mar 19 10:44:13 crc kubenswrapper[4765]: I0319 10:44:13.393776 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="dc6402d319530b9a9a7ff815c27d6f5ca35cb7a1c142ceeab9c264a9c9e6d67e" Mar 19 10:44:13 crc kubenswrapper[4765]: I0319 10:44:13.393838 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565284-5vl4n" Mar 19 10:44:13 crc kubenswrapper[4765]: I0319 10:44:13.446738 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565278-cl6lh"] Mar 19 10:44:13 crc kubenswrapper[4765]: I0319 10:44:13.457873 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565278-cl6lh"] Mar 19 10:44:14 crc kubenswrapper[4765]: I0319 10:44:14.369558 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="141a4fdb-54ad-466b-90d6-b9209e18b1a7" path="/var/lib/kubelet/pods/141a4fdb-54ad-466b-90d6-b9209e18b1a7/volumes" Mar 19 10:44:14 crc kubenswrapper[4765]: I0319 10:44:14.404870 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdadca91-8f46-407e-ab76-30cb026fe08a","Type":"ContainerStarted","Data":"63603b1a5923369e5bd4bb977e79e8b5566ccb3ef90463849c302f9e815a9534"} Mar 19 10:44:23 crc kubenswrapper[4765]: I0319 10:44:23.492187 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdadca91-8f46-407e-ab76-30cb026fe08a","Type":"ContainerStarted","Data":"d1e2a458ee1258b40e7b454db8bb60590f1a502ae8c8295e8b06895b3e2e326f"} Mar 19 10:44:27 crc kubenswrapper[4765]: I0319 10:44:27.523840 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rvf9r" event={"ID":"d0411109-7b7f-4013-baef-8970df3e2dbf","Type":"ContainerStarted","Data":"8f69d0cf3afd3b05c0afe19bba5388841ccaf130da23fbcd1abebed1314ca46b"} Mar 19 10:44:27 crc kubenswrapper[4765]: I0319 10:44:27.539952 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-db-sync-rvf9r" podStartSLOduration=2.922336797 podStartE2EDuration="33.539936069s" podCreationTimestamp="2026-03-19 10:43:54 +0000 UTC" firstStartedPulling="2026-03-19 10:43:55.428339524 +0000 UTC m=+1333.777285066" 
lastFinishedPulling="2026-03-19 10:44:26.045938796 +0000 UTC m=+1364.394884338" observedRunningTime="2026-03-19 10:44:27.538752596 +0000 UTC m=+1365.887698138" watchObservedRunningTime="2026-03-19 10:44:27.539936069 +0000 UTC m=+1365.888881611" Mar 19 10:44:29 crc kubenswrapper[4765]: I0319 10:44:29.544341 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdadca91-8f46-407e-ab76-30cb026fe08a","Type":"ContainerStarted","Data":"005261572a1b5c3f97c8fbda80c183916b2d32b17356130c040a169ca3ad4c72"} Mar 19 10:44:29 crc kubenswrapper[4765]: I0319 10:44:29.544864 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerName="ceilometer-central-agent" containerID="cri-o://2fb337270c737852a0ee68b737b98aee76ab8f4410dd1b93e8bf1cd484dae6eb" gracePeriod=30 Mar 19 10:44:29 crc kubenswrapper[4765]: I0319 10:44:29.545045 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 10:44:29 crc kubenswrapper[4765]: I0319 10:44:29.545166 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerName="proxy-httpd" containerID="cri-o://005261572a1b5c3f97c8fbda80c183916b2d32b17356130c040a169ca3ad4c72" gracePeriod=30 Mar 19 10:44:29 crc kubenswrapper[4765]: I0319 10:44:29.545250 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerName="sg-core" containerID="cri-o://d1e2a458ee1258b40e7b454db8bb60590f1a502ae8c8295e8b06895b3e2e326f" gracePeriod=30 Mar 19 10:44:29 crc kubenswrapper[4765]: I0319 10:44:29.545214 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" 
containerName="ceilometer-notification-agent" containerID="cri-o://63603b1a5923369e5bd4bb977e79e8b5566ccb3ef90463849c302f9e815a9534" gracePeriod=30 Mar 19 10:44:29 crc kubenswrapper[4765]: I0319 10:44:29.572411 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.287182638 podStartE2EDuration="20.572394963s" podCreationTimestamp="2026-03-19 10:44:09 +0000 UTC" firstStartedPulling="2026-03-19 10:44:10.294155447 +0000 UTC m=+1348.643100989" lastFinishedPulling="2026-03-19 10:44:28.579367772 +0000 UTC m=+1366.928313314" observedRunningTime="2026-03-19 10:44:29.569696317 +0000 UTC m=+1367.918641859" watchObservedRunningTime="2026-03-19 10:44:29.572394963 +0000 UTC m=+1367.921340505" Mar 19 10:44:30 crc kubenswrapper[4765]: I0319 10:44:30.556674 4765 generic.go:334] "Generic (PLEG): container finished" podID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerID="005261572a1b5c3f97c8fbda80c183916b2d32b17356130c040a169ca3ad4c72" exitCode=0 Mar 19 10:44:30 crc kubenswrapper[4765]: I0319 10:44:30.556708 4765 generic.go:334] "Generic (PLEG): container finished" podID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerID="d1e2a458ee1258b40e7b454db8bb60590f1a502ae8c8295e8b06895b3e2e326f" exitCode=2 Mar 19 10:44:30 crc kubenswrapper[4765]: I0319 10:44:30.556717 4765 generic.go:334] "Generic (PLEG): container finished" podID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerID="2fb337270c737852a0ee68b737b98aee76ab8f4410dd1b93e8bf1cd484dae6eb" exitCode=0 Mar 19 10:44:30 crc kubenswrapper[4765]: I0319 10:44:30.556736 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdadca91-8f46-407e-ab76-30cb026fe08a","Type":"ContainerDied","Data":"005261572a1b5c3f97c8fbda80c183916b2d32b17356130c040a169ca3ad4c72"} Mar 19 10:44:30 crc kubenswrapper[4765]: I0319 10:44:30.556761 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"bdadca91-8f46-407e-ab76-30cb026fe08a","Type":"ContainerDied","Data":"d1e2a458ee1258b40e7b454db8bb60590f1a502ae8c8295e8b06895b3e2e326f"} Mar 19 10:44:30 crc kubenswrapper[4765]: I0319 10:44:30.556772 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdadca91-8f46-407e-ab76-30cb026fe08a","Type":"ContainerDied","Data":"2fb337270c737852a0ee68b737b98aee76ab8f4410dd1b93e8bf1cd484dae6eb"} Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.021935 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.219402 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-sg-core-conf-yaml\") pod \"bdadca91-8f46-407e-ab76-30cb026fe08a\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.219796 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdadca91-8f46-407e-ab76-30cb026fe08a-run-httpd\") pod \"bdadca91-8f46-407e-ab76-30cb026fe08a\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.219920 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-scripts\") pod \"bdadca91-8f46-407e-ab76-30cb026fe08a\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.219973 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-csgrs\" (UniqueName: \"kubernetes.io/projected/bdadca91-8f46-407e-ab76-30cb026fe08a-kube-api-access-csgrs\") pod \"bdadca91-8f46-407e-ab76-30cb026fe08a\" (UID: 
\"bdadca91-8f46-407e-ab76-30cb026fe08a\") " Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.220091 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-combined-ca-bundle\") pod \"bdadca91-8f46-407e-ab76-30cb026fe08a\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.220171 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdadca91-8f46-407e-ab76-30cb026fe08a-log-httpd\") pod \"bdadca91-8f46-407e-ab76-30cb026fe08a\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.220202 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-config-data\") pod \"bdadca91-8f46-407e-ab76-30cb026fe08a\" (UID: \"bdadca91-8f46-407e-ab76-30cb026fe08a\") " Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.224520 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdadca91-8f46-407e-ab76-30cb026fe08a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "bdadca91-8f46-407e-ab76-30cb026fe08a" (UID: "bdadca91-8f46-407e-ab76-30cb026fe08a"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.224748 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bdadca91-8f46-407e-ab76-30cb026fe08a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "bdadca91-8f46-407e-ab76-30cb026fe08a" (UID: "bdadca91-8f46-407e-ab76-30cb026fe08a"). InnerVolumeSpecName "log-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.226415 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bdadca91-8f46-407e-ab76-30cb026fe08a-kube-api-access-csgrs" (OuterVolumeSpecName: "kube-api-access-csgrs") pod "bdadca91-8f46-407e-ab76-30cb026fe08a" (UID: "bdadca91-8f46-407e-ab76-30cb026fe08a"). InnerVolumeSpecName "kube-api-access-csgrs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.228594 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-scripts" (OuterVolumeSpecName: "scripts") pod "bdadca91-8f46-407e-ab76-30cb026fe08a" (UID: "bdadca91-8f46-407e-ab76-30cb026fe08a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.247832 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "bdadca91-8f46-407e-ab76-30cb026fe08a" (UID: "bdadca91-8f46-407e-ab76-30cb026fe08a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.295400 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "bdadca91-8f46-407e-ab76-30cb026fe08a" (UID: "bdadca91-8f46-407e-ab76-30cb026fe08a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.316002 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-config-data" (OuterVolumeSpecName: "config-data") pod "bdadca91-8f46-407e-ab76-30cb026fe08a" (UID: "bdadca91-8f46-407e-ab76-30cb026fe08a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.322387 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.322419 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdadca91-8f46-407e-ab76-30cb026fe08a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.322430 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.322440 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.322450 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/bdadca91-8f46-407e-ab76-30cb026fe08a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.322460 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/bdadca91-8f46-407e-ab76-30cb026fe08a-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.322472 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-csgrs\" (UniqueName: \"kubernetes.io/projected/bdadca91-8f46-407e-ab76-30cb026fe08a-kube-api-access-csgrs\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.568504 4765 generic.go:334] "Generic (PLEG): container finished" podID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerID="63603b1a5923369e5bd4bb977e79e8b5566ccb3ef90463849c302f9e815a9534" exitCode=0 Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.568555 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdadca91-8f46-407e-ab76-30cb026fe08a","Type":"ContainerDied","Data":"63603b1a5923369e5bd4bb977e79e8b5566ccb3ef90463849c302f9e815a9534"} Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.568584 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"bdadca91-8f46-407e-ab76-30cb026fe08a","Type":"ContainerDied","Data":"e867426dafb049825702f8b101bdaf8d948ad2b8c5f4e355717d09e71ddf3176"} Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.568606 4765 scope.go:117] "RemoveContainer" containerID="005261572a1b5c3f97c8fbda80c183916b2d32b17356130c040a169ca3ad4c72" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.570066 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.594947 4765 scope.go:117] "RemoveContainer" containerID="d1e2a458ee1258b40e7b454db8bb60590f1a502ae8c8295e8b06895b3e2e326f" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.617217 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.617602 4765 scope.go:117] "RemoveContainer" containerID="63603b1a5923369e5bd4bb977e79e8b5566ccb3ef90463849c302f9e815a9534" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.633793 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.642923 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:44:31 crc kubenswrapper[4765]: E0319 10:44:31.643294 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerName="sg-core" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.643310 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerName="sg-core" Mar 19 10:44:31 crc kubenswrapper[4765]: E0319 10:44:31.643329 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerName="proxy-httpd" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.643336 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerName="proxy-httpd" Mar 19 10:44:31 crc kubenswrapper[4765]: E0319 10:44:31.643347 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerName="ceilometer-notification-agent" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.643354 4765 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerName="ceilometer-notification-agent" Mar 19 10:44:31 crc kubenswrapper[4765]: E0319 10:44:31.643373 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerName="ceilometer-central-agent" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.643379 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerName="ceilometer-central-agent" Mar 19 10:44:31 crc kubenswrapper[4765]: E0319 10:44:31.643389 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="183750a8-660e-41fd-85f8-6deb34af3c2c" containerName="oc" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.643394 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="183750a8-660e-41fd-85f8-6deb34af3c2c" containerName="oc" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.643548 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerName="sg-core" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.643563 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="183750a8-660e-41fd-85f8-6deb34af3c2c" containerName="oc" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.643573 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerName="proxy-httpd" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.643582 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerName="ceilometer-central-agent" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.643596 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" containerName="ceilometer-notification-agent" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.647044 4765 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.650837 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.650934 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.664734 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.685863 4765 scope.go:117] "RemoveContainer" containerID="2fb337270c737852a0ee68b737b98aee76ab8f4410dd1b93e8bf1cd484dae6eb" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.723547 4765 scope.go:117] "RemoveContainer" containerID="005261572a1b5c3f97c8fbda80c183916b2d32b17356130c040a169ca3ad4c72" Mar 19 10:44:31 crc kubenswrapper[4765]: E0319 10:44:31.724051 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"005261572a1b5c3f97c8fbda80c183916b2d32b17356130c040a169ca3ad4c72\": container with ID starting with 005261572a1b5c3f97c8fbda80c183916b2d32b17356130c040a169ca3ad4c72 not found: ID does not exist" containerID="005261572a1b5c3f97c8fbda80c183916b2d32b17356130c040a169ca3ad4c72" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.724088 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"005261572a1b5c3f97c8fbda80c183916b2d32b17356130c040a169ca3ad4c72"} err="failed to get container status \"005261572a1b5c3f97c8fbda80c183916b2d32b17356130c040a169ca3ad4c72\": rpc error: code = NotFound desc = could not find container \"005261572a1b5c3f97c8fbda80c183916b2d32b17356130c040a169ca3ad4c72\": container with ID starting with 005261572a1b5c3f97c8fbda80c183916b2d32b17356130c040a169ca3ad4c72 not found: ID does not exist" Mar 19 10:44:31 crc kubenswrapper[4765]: 
I0319 10:44:31.724114 4765 scope.go:117] "RemoveContainer" containerID="d1e2a458ee1258b40e7b454db8bb60590f1a502ae8c8295e8b06895b3e2e326f" Mar 19 10:44:31 crc kubenswrapper[4765]: E0319 10:44:31.724346 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d1e2a458ee1258b40e7b454db8bb60590f1a502ae8c8295e8b06895b3e2e326f\": container with ID starting with d1e2a458ee1258b40e7b454db8bb60590f1a502ae8c8295e8b06895b3e2e326f not found: ID does not exist" containerID="d1e2a458ee1258b40e7b454db8bb60590f1a502ae8c8295e8b06895b3e2e326f" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.724375 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d1e2a458ee1258b40e7b454db8bb60590f1a502ae8c8295e8b06895b3e2e326f"} err="failed to get container status \"d1e2a458ee1258b40e7b454db8bb60590f1a502ae8c8295e8b06895b3e2e326f\": rpc error: code = NotFound desc = could not find container \"d1e2a458ee1258b40e7b454db8bb60590f1a502ae8c8295e8b06895b3e2e326f\": container with ID starting with d1e2a458ee1258b40e7b454db8bb60590f1a502ae8c8295e8b06895b3e2e326f not found: ID does not exist" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.724395 4765 scope.go:117] "RemoveContainer" containerID="63603b1a5923369e5bd4bb977e79e8b5566ccb3ef90463849c302f9e815a9534" Mar 19 10:44:31 crc kubenswrapper[4765]: E0319 10:44:31.724583 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63603b1a5923369e5bd4bb977e79e8b5566ccb3ef90463849c302f9e815a9534\": container with ID starting with 63603b1a5923369e5bd4bb977e79e8b5566ccb3ef90463849c302f9e815a9534 not found: ID does not exist" containerID="63603b1a5923369e5bd4bb977e79e8b5566ccb3ef90463849c302f9e815a9534" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.724606 4765 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"63603b1a5923369e5bd4bb977e79e8b5566ccb3ef90463849c302f9e815a9534"} err="failed to get container status \"63603b1a5923369e5bd4bb977e79e8b5566ccb3ef90463849c302f9e815a9534\": rpc error: code = NotFound desc = could not find container \"63603b1a5923369e5bd4bb977e79e8b5566ccb3ef90463849c302f9e815a9534\": container with ID starting with 63603b1a5923369e5bd4bb977e79e8b5566ccb3ef90463849c302f9e815a9534 not found: ID does not exist" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.724625 4765 scope.go:117] "RemoveContainer" containerID="2fb337270c737852a0ee68b737b98aee76ab8f4410dd1b93e8bf1cd484dae6eb" Mar 19 10:44:31 crc kubenswrapper[4765]: E0319 10:44:31.724889 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fb337270c737852a0ee68b737b98aee76ab8f4410dd1b93e8bf1cd484dae6eb\": container with ID starting with 2fb337270c737852a0ee68b737b98aee76ab8f4410dd1b93e8bf1cd484dae6eb not found: ID does not exist" containerID="2fb337270c737852a0ee68b737b98aee76ab8f4410dd1b93e8bf1cd484dae6eb" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.724914 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fb337270c737852a0ee68b737b98aee76ab8f4410dd1b93e8bf1cd484dae6eb"} err="failed to get container status \"2fb337270c737852a0ee68b737b98aee76ab8f4410dd1b93e8bf1cd484dae6eb\": rpc error: code = NotFound desc = could not find container \"2fb337270c737852a0ee68b737b98aee76ab8f4410dd1b93e8bf1cd484dae6eb\": container with ID starting with 2fb337270c737852a0ee68b737b98aee76ab8f4410dd1b93e8bf1cd484dae6eb not found: ID does not exist" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.735058 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rll9\" (UniqueName: \"kubernetes.io/projected/a0afe23f-4c2a-4e60-9f57-189009a620f8-kube-api-access-8rll9\") pod 
\"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.735111 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-config-data\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.735199 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.735230 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0afe23f-4c2a-4e60-9f57-189009a620f8-log-httpd\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.735255 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.735271 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-scripts\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 
10:44:31.735372 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0afe23f-4c2a-4e60-9f57-189009a620f8-run-httpd\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.836479 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0afe23f-4c2a-4e60-9f57-189009a620f8-run-httpd\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.836559 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8rll9\" (UniqueName: \"kubernetes.io/projected/a0afe23f-4c2a-4e60-9f57-189009a620f8-kube-api-access-8rll9\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.836599 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-config-data\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.836717 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.836761 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0afe23f-4c2a-4e60-9f57-189009a620f8-log-httpd\") pod 
\"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.836790 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.836816 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-scripts\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.836935 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0afe23f-4c2a-4e60-9f57-189009a620f8-run-httpd\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.837402 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0afe23f-4c2a-4e60-9f57-189009a620f8-log-httpd\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.842071 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.842288 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-config-data\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.849070 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.853118 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-scripts\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:31 crc kubenswrapper[4765]: I0319 10:44:31.860612 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rll9\" (UniqueName: \"kubernetes.io/projected/a0afe23f-4c2a-4e60-9f57-189009a620f8-kube-api-access-8rll9\") pod \"ceilometer-0\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " pod="openstack/ceilometer-0" Mar 19 10:44:32 crc kubenswrapper[4765]: I0319 10:44:32.016711 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:44:32 crc kubenswrapper[4765]: I0319 10:44:32.367716 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bdadca91-8f46-407e-ab76-30cb026fe08a" path="/var/lib/kubelet/pods/bdadca91-8f46-407e-ab76-30cb026fe08a/volumes" Mar 19 10:44:32 crc kubenswrapper[4765]: I0319 10:44:32.468041 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:44:32 crc kubenswrapper[4765]: W0319 10:44:32.469390 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda0afe23f_4c2a_4e60_9f57_189009a620f8.slice/crio-86899b78b87e834b4fc89c88af757294e2c6780beac29bb4f1a342f1cf7f60a7 WatchSource:0}: Error finding container 86899b78b87e834b4fc89c88af757294e2c6780beac29bb4f1a342f1cf7f60a7: Status 404 returned error can't find the container with id 86899b78b87e834b4fc89c88af757294e2c6780beac29bb4f1a342f1cf7f60a7 Mar 19 10:44:32 crc kubenswrapper[4765]: I0319 10:44:32.577756 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0afe23f-4c2a-4e60-9f57-189009a620f8","Type":"ContainerStarted","Data":"86899b78b87e834b4fc89c88af757294e2c6780beac29bb4f1a342f1cf7f60a7"} Mar 19 10:44:34 crc kubenswrapper[4765]: I0319 10:44:34.601805 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0afe23f-4c2a-4e60-9f57-189009a620f8","Type":"ContainerStarted","Data":"7fd7c5b075326d2630e8509d2874b4a7931bd2d4fb28c1d2328de2daf3720f15"} Mar 19 10:44:35 crc kubenswrapper[4765]: I0319 10:44:35.611699 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0afe23f-4c2a-4e60-9f57-189009a620f8","Type":"ContainerStarted","Data":"eaa3319bac0f28efb9983e12dd4ff24efc0e820215132e22f6f3366ccf6479f7"} Mar 19 10:44:37 crc kubenswrapper[4765]: I0319 10:44:37.629985 4765 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/ceilometer-0" event={"ID":"a0afe23f-4c2a-4e60-9f57-189009a620f8","Type":"ContainerStarted","Data":"a28d6020624e0ff33f61930cb465b77be95b1eea5da41229812c2eaf07f0fd15"} Mar 19 10:44:39 crc kubenswrapper[4765]: I0319 10:44:39.658266 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0afe23f-4c2a-4e60-9f57-189009a620f8","Type":"ContainerStarted","Data":"3a42f834745602a6c5e722e1de4dee0c09470d9cd8a7dbc46fa8cea16f066c33"} Mar 19 10:44:39 crc kubenswrapper[4765]: I0319 10:44:39.658982 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 10:44:50 crc kubenswrapper[4765]: I0319 10:44:50.746015 4765 generic.go:334] "Generic (PLEG): container finished" podID="d0411109-7b7f-4013-baef-8970df3e2dbf" containerID="8f69d0cf3afd3b05c0afe19bba5388841ccaf130da23fbcd1abebed1314ca46b" exitCode=0 Mar 19 10:44:50 crc kubenswrapper[4765]: I0319 10:44:50.746096 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rvf9r" event={"ID":"d0411109-7b7f-4013-baef-8970df3e2dbf","Type":"ContainerDied","Data":"8f69d0cf3afd3b05c0afe19bba5388841ccaf130da23fbcd1abebed1314ca46b"} Mar 19 10:44:50 crc kubenswrapper[4765]: I0319 10:44:50.767853 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=13.632880416999999 podStartE2EDuration="19.767833613s" podCreationTimestamp="2026-03-19 10:44:31 +0000 UTC" firstStartedPulling="2026-03-19 10:44:32.471556337 +0000 UTC m=+1370.820501879" lastFinishedPulling="2026-03-19 10:44:38.606509533 +0000 UTC m=+1376.955455075" observedRunningTime="2026-03-19 10:44:39.681823943 +0000 UTC m=+1378.030769485" watchObservedRunningTime="2026-03-19 10:44:50.767833613 +0000 UTC m=+1389.116779155" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.225096 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.339203 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-config-data\") pod \"d0411109-7b7f-4013-baef-8970df3e2dbf\" (UID: \"d0411109-7b7f-4013-baef-8970df3e2dbf\") " Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.339260 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-combined-ca-bundle\") pod \"d0411109-7b7f-4013-baef-8970df3e2dbf\" (UID: \"d0411109-7b7f-4013-baef-8970df3e2dbf\") " Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.339398 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-scripts\") pod \"d0411109-7b7f-4013-baef-8970df3e2dbf\" (UID: \"d0411109-7b7f-4013-baef-8970df3e2dbf\") " Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.339429 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z746p\" (UniqueName: \"kubernetes.io/projected/d0411109-7b7f-4013-baef-8970df3e2dbf-kube-api-access-z746p\") pod \"d0411109-7b7f-4013-baef-8970df3e2dbf\" (UID: \"d0411109-7b7f-4013-baef-8970df3e2dbf\") " Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.348016 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-scripts" (OuterVolumeSpecName: "scripts") pod "d0411109-7b7f-4013-baef-8970df3e2dbf" (UID: "d0411109-7b7f-4013-baef-8970df3e2dbf"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.348112 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0411109-7b7f-4013-baef-8970df3e2dbf-kube-api-access-z746p" (OuterVolumeSpecName: "kube-api-access-z746p") pod "d0411109-7b7f-4013-baef-8970df3e2dbf" (UID: "d0411109-7b7f-4013-baef-8970df3e2dbf"). InnerVolumeSpecName "kube-api-access-z746p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.410507 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-config-data" (OuterVolumeSpecName: "config-data") pod "d0411109-7b7f-4013-baef-8970df3e2dbf" (UID: "d0411109-7b7f-4013-baef-8970df3e2dbf"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.417017 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d0411109-7b7f-4013-baef-8970df3e2dbf" (UID: "d0411109-7b7f-4013-baef-8970df3e2dbf"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.445048 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.445097 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.445106 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/d0411109-7b7f-4013-baef-8970df3e2dbf-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.445115 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z746p\" (UniqueName: \"kubernetes.io/projected/d0411109-7b7f-4013-baef-8970df3e2dbf-kube-api-access-z746p\") on node \"crc\" DevicePath \"\"" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.765993 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-db-sync-rvf9r" event={"ID":"d0411109-7b7f-4013-baef-8970df3e2dbf","Type":"ContainerDied","Data":"11952f85695122fda5b1876001bdb567afc129c9bfef41bc331ba3acb65f30bf"} Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.766045 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11952f85695122fda5b1876001bdb567afc129c9bfef41bc331ba3acb65f30bf" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.766178 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-db-sync-rvf9r" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.885233 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 19 10:44:52 crc kubenswrapper[4765]: E0319 10:44:52.885637 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0411109-7b7f-4013-baef-8970df3e2dbf" containerName="nova-cell0-conductor-db-sync" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.885656 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0411109-7b7f-4013-baef-8970df3e2dbf" containerName="nova-cell0-conductor-db-sync" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.885862 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0411109-7b7f-4013-baef-8970df3e2dbf" containerName="nova-cell0-conductor-db-sync" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.886472 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.888826 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-nova-dockercfg-t8r4j" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.889016 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-conductor-config-data" Mar 19 10:44:52 crc kubenswrapper[4765]: I0319 10:44:52.898803 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 19 10:44:53 crc kubenswrapper[4765]: I0319 10:44:53.054944 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb0de8e-0a7f-4324-8195-2bab8419c2ba-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3eb0de8e-0a7f-4324-8195-2bab8419c2ba\") " pod="openstack/nova-cell0-conductor-0" Mar 19 10:44:53 crc kubenswrapper[4765]: 
I0319 10:44:53.055052 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bf6s\" (UniqueName: \"kubernetes.io/projected/3eb0de8e-0a7f-4324-8195-2bab8419c2ba-kube-api-access-4bf6s\") pod \"nova-cell0-conductor-0\" (UID: \"3eb0de8e-0a7f-4324-8195-2bab8419c2ba\") " pod="openstack/nova-cell0-conductor-0" Mar 19 10:44:53 crc kubenswrapper[4765]: I0319 10:44:53.055387 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb0de8e-0a7f-4324-8195-2bab8419c2ba-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3eb0de8e-0a7f-4324-8195-2bab8419c2ba\") " pod="openstack/nova-cell0-conductor-0" Mar 19 10:44:53 crc kubenswrapper[4765]: I0319 10:44:53.158478 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb0de8e-0a7f-4324-8195-2bab8419c2ba-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3eb0de8e-0a7f-4324-8195-2bab8419c2ba\") " pod="openstack/nova-cell0-conductor-0" Mar 19 10:44:53 crc kubenswrapper[4765]: I0319 10:44:53.158532 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bf6s\" (UniqueName: \"kubernetes.io/projected/3eb0de8e-0a7f-4324-8195-2bab8419c2ba-kube-api-access-4bf6s\") pod \"nova-cell0-conductor-0\" (UID: \"3eb0de8e-0a7f-4324-8195-2bab8419c2ba\") " pod="openstack/nova-cell0-conductor-0" Mar 19 10:44:53 crc kubenswrapper[4765]: I0319 10:44:53.158895 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb0de8e-0a7f-4324-8195-2bab8419c2ba-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3eb0de8e-0a7f-4324-8195-2bab8419c2ba\") " pod="openstack/nova-cell0-conductor-0" Mar 19 10:44:53 crc kubenswrapper[4765]: I0319 10:44:53.163511 4765 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3eb0de8e-0a7f-4324-8195-2bab8419c2ba-config-data\") pod \"nova-cell0-conductor-0\" (UID: \"3eb0de8e-0a7f-4324-8195-2bab8419c2ba\") " pod="openstack/nova-cell0-conductor-0" Mar 19 10:44:53 crc kubenswrapper[4765]: I0319 10:44:53.177673 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/3eb0de8e-0a7f-4324-8195-2bab8419c2ba-combined-ca-bundle\") pod \"nova-cell0-conductor-0\" (UID: \"3eb0de8e-0a7f-4324-8195-2bab8419c2ba\") " pod="openstack/nova-cell0-conductor-0" Mar 19 10:44:53 crc kubenswrapper[4765]: I0319 10:44:53.181509 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bf6s\" (UniqueName: \"kubernetes.io/projected/3eb0de8e-0a7f-4324-8195-2bab8419c2ba-kube-api-access-4bf6s\") pod \"nova-cell0-conductor-0\" (UID: \"3eb0de8e-0a7f-4324-8195-2bab8419c2ba\") " pod="openstack/nova-cell0-conductor-0" Mar 19 10:44:53 crc kubenswrapper[4765]: I0319 10:44:53.217086 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-conductor-0" Mar 19 10:44:53 crc kubenswrapper[4765]: I0319 10:44:53.684235 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-conductor-0"] Mar 19 10:44:53 crc kubenswrapper[4765]: W0319 10:44:53.690051 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3eb0de8e_0a7f_4324_8195_2bab8419c2ba.slice/crio-548cf2e47fd54ce6a48771d47e9e5d50721348c7a4bb9c3a4a354af3216b38f9 WatchSource:0}: Error finding container 548cf2e47fd54ce6a48771d47e9e5d50721348c7a4bb9c3a4a354af3216b38f9: Status 404 returned error can't find the container with id 548cf2e47fd54ce6a48771d47e9e5d50721348c7a4bb9c3a4a354af3216b38f9 Mar 19 10:44:53 crc kubenswrapper[4765]: I0319 10:44:53.777719 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3eb0de8e-0a7f-4324-8195-2bab8419c2ba","Type":"ContainerStarted","Data":"548cf2e47fd54ce6a48771d47e9e5d50721348c7a4bb9c3a4a354af3216b38f9"} Mar 19 10:44:54 crc kubenswrapper[4765]: I0319 10:44:54.788586 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-conductor-0" event={"ID":"3eb0de8e-0a7f-4324-8195-2bab8419c2ba","Type":"ContainerStarted","Data":"99cc2db2e4d05a196900c99fa93104586f6a7e811b830b16cc12c248e5d2ae94"} Mar 19 10:44:54 crc kubenswrapper[4765]: I0319 10:44:54.789048 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell0-conductor-0" Mar 19 10:44:54 crc kubenswrapper[4765]: I0319 10:44:54.815233 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-conductor-0" podStartSLOduration=2.815208238 podStartE2EDuration="2.815208238s" podCreationTimestamp="2026-03-19 10:44:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 
10:44:54.807581764 +0000 UTC m=+1393.156527306" watchObservedRunningTime="2026-03-19 10:44:54.815208238 +0000 UTC m=+1393.164153820" Mar 19 10:44:57 crc kubenswrapper[4765]: I0319 10:44:57.022236 4765 scope.go:117] "RemoveContainer" containerID="b2647d1bbe6b3a8341f4c6c0546cb3540a72d35faa02bcf3f0607dd968fb223b" Mar 19 10:44:58 crc kubenswrapper[4765]: I0319 10:44:58.246025 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell0-conductor-0" Mar 19 10:44:58 crc kubenswrapper[4765]: I0319 10:44:58.774792 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell0-cell-mapping-7rjkg"] Mar 19 10:44:58 crc kubenswrapper[4765]: I0319 10:44:58.776668 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:44:58 crc kubenswrapper[4765]: I0319 10:44:58.779805 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-scripts" Mar 19 10:44:58 crc kubenswrapper[4765]: I0319 10:44:58.780304 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell0-manage-config-data" Mar 19 10:44:58 crc kubenswrapper[4765]: I0319 10:44:58.786815 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7rjkg"] Mar 19 10:44:58 crc kubenswrapper[4765]: I0319 10:44:58.896566 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-config-data\") pod \"nova-cell0-cell-mapping-7rjkg\" (UID: \"313b3021-c103-47ac-9cb5-b38e971d22fd\") " pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:44:58 crc kubenswrapper[4765]: I0319 10:44:58.897049 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xhlls\" (UniqueName: 
\"kubernetes.io/projected/313b3021-c103-47ac-9cb5-b38e971d22fd-kube-api-access-xhlls\") pod \"nova-cell0-cell-mapping-7rjkg\" (UID: \"313b3021-c103-47ac-9cb5-b38e971d22fd\") " pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:44:58 crc kubenswrapper[4765]: I0319 10:44:58.897196 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7rjkg\" (UID: \"313b3021-c103-47ac-9cb5-b38e971d22fd\") " pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:44:58 crc kubenswrapper[4765]: I0319 10:44:58.897265 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-scripts\") pod \"nova-cell0-cell-mapping-7rjkg\" (UID: \"313b3021-c103-47ac-9cb5-b38e971d22fd\") " pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:44:58 crc kubenswrapper[4765]: I0319 10:44:58.972461 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 10:44:58 crc kubenswrapper[4765]: I0319 10:44:58.973665 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 10:44:58 crc kubenswrapper[4765]: I0319 10:44:58.984923 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 10:44:58 crc kubenswrapper[4765]: I0319 10:44:58.986811 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 10:44:58 crc kubenswrapper[4765]: I0319 10:44:58.996711 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.006986 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.008344 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-config-data\") pod \"nova-api-0\" (UID: \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " pod="openstack/nova-api-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.008382 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " pod="openstack/nova-api-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.008412 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-scripts\") pod \"nova-cell0-cell-mapping-7rjkg\" (UID: \"313b3021-c103-47ac-9cb5-b38e971d22fd\") " pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.008476 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n9fbb\" (UniqueName: \"kubernetes.io/projected/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-kube-api-access-n9fbb\") pod \"nova-api-0\" (UID: \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " pod="openstack/nova-api-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.008514 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f13d82bb-947c-43a3-8681-00c59cb179d9-config-data\") pod \"nova-scheduler-0\" (UID: \"f13d82bb-947c-43a3-8681-00c59cb179d9\") " pod="openstack/nova-scheduler-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.008544 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-config-data\") pod \"nova-cell0-cell-mapping-7rjkg\" (UID: \"313b3021-c103-47ac-9cb5-b38e971d22fd\") " pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.008588 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xhlls\" (UniqueName: \"kubernetes.io/projected/313b3021-c103-47ac-9cb5-b38e971d22fd-kube-api-access-xhlls\") pod \"nova-cell0-cell-mapping-7rjkg\" (UID: \"313b3021-c103-47ac-9cb5-b38e971d22fd\") " pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.008626 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-logs\") pod \"nova-api-0\" (UID: \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " pod="openstack/nova-api-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.008673 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45qvh\" (UniqueName: \"kubernetes.io/projected/f13d82bb-947c-43a3-8681-00c59cb179d9-kube-api-access-45qvh\") pod \"nova-scheduler-0\" (UID: \"f13d82bb-947c-43a3-8681-00c59cb179d9\") " pod="openstack/nova-scheduler-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.008755 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7rjkg\" (UID: \"313b3021-c103-47ac-9cb5-b38e971d22fd\") " pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.008781 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13d82bb-947c-43a3-8681-00c59cb179d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f13d82bb-947c-43a3-8681-00c59cb179d9\") " pod="openstack/nova-scheduler-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.017602 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.017869 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-scripts\") pod \"nova-cell0-cell-mapping-7rjkg\" (UID: \"313b3021-c103-47ac-9cb5-b38e971d22fd\") " pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.018006 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-combined-ca-bundle\") pod \"nova-cell0-cell-mapping-7rjkg\" (UID: \"313b3021-c103-47ac-9cb5-b38e971d22fd\") " pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.018257 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.023009 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-config-data\") pod \"nova-cell0-cell-mapping-7rjkg\" (UID: 
\"313b3021-c103-47ac-9cb5-b38e971d22fd\") " pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.030616 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xhlls\" (UniqueName: \"kubernetes.io/projected/313b3021-c103-47ac-9cb5-b38e971d22fd-kube-api-access-xhlls\") pod \"nova-cell0-cell-mapping-7rjkg\" (UID: \"313b3021-c103-47ac-9cb5-b38e971d22fd\") " pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.081063 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.082936 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.088782 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.113615 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.114661 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d924f586-eb76-40c9-a8d7-10f9f9935511-logs\") pod \"nova-metadata-0\" (UID: \"d924f586-eb76-40c9-a8d7-10f9f9935511\") " pod="openstack/nova-metadata-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.114771 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d924f586-eb76-40c9-a8d7-10f9f9935511-config-data\") pod \"nova-metadata-0\" (UID: \"d924f586-eb76-40c9-a8d7-10f9f9935511\") " pod="openstack/nova-metadata-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.114831 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13d82bb-947c-43a3-8681-00c59cb179d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f13d82bb-947c-43a3-8681-00c59cb179d9\") " pod="openstack/nova-scheduler-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.114865 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d924f586-eb76-40c9-a8d7-10f9f9935511-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d924f586-eb76-40c9-a8d7-10f9f9935511\") " pod="openstack/nova-metadata-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.114909 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-config-data\") pod \"nova-api-0\" (UID: \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " pod="openstack/nova-api-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.114933 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " pod="openstack/nova-api-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.117988 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13d82bb-947c-43a3-8681-00c59cb179d9-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"f13d82bb-947c-43a3-8681-00c59cb179d9\") " pod="openstack/nova-scheduler-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.122296 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-config-data\") pod \"nova-api-0\" (UID: \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " pod="openstack/nova-api-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.124614 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n9fbb\" (UniqueName: \"kubernetes.io/projected/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-kube-api-access-n9fbb\") pod \"nova-api-0\" (UID: \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " pod="openstack/nova-api-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.124675 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6dqp\" (UniqueName: \"kubernetes.io/projected/d924f586-eb76-40c9-a8d7-10f9f9935511-kube-api-access-g6dqp\") pod \"nova-metadata-0\" (UID: \"d924f586-eb76-40c9-a8d7-10f9f9935511\") " pod="openstack/nova-metadata-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.124744 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f13d82bb-947c-43a3-8681-00c59cb179d9-config-data\") 
pod \"nova-scheduler-0\" (UID: \"f13d82bb-947c-43a3-8681-00c59cb179d9\") " pod="openstack/nova-scheduler-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.124889 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-logs\") pod \"nova-api-0\" (UID: \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " pod="openstack/nova-api-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.125009 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45qvh\" (UniqueName: \"kubernetes.io/projected/f13d82bb-947c-43a3-8681-00c59cb179d9-kube-api-access-45qvh\") pod \"nova-scheduler-0\" (UID: \"f13d82bb-947c-43a3-8681-00c59cb179d9\") " pod="openstack/nova-scheduler-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.125658 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-logs\") pod \"nova-api-0\" (UID: \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " pod="openstack/nova-api-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.126901 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.127952 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f13d82bb-947c-43a3-8681-00c59cb179d9-config-data\") pod \"nova-scheduler-0\" (UID: \"f13d82bb-947c-43a3-8681-00c59cb179d9\") " pod="openstack/nova-scheduler-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.130480 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " pod="openstack/nova-api-0" Mar 
19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.155596 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45qvh\" (UniqueName: \"kubernetes.io/projected/f13d82bb-947c-43a3-8681-00c59cb179d9-kube-api-access-45qvh\") pod \"nova-scheduler-0\" (UID: \"f13d82bb-947c-43a3-8681-00c59cb179d9\") " pod="openstack/nova-scheduler-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.183855 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n9fbb\" (UniqueName: \"kubernetes.io/projected/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-kube-api-access-n9fbb\") pod \"nova-api-0\" (UID: \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " pod="openstack/nova-api-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.226244 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d924f586-eb76-40c9-a8d7-10f9f9935511-logs\") pod \"nova-metadata-0\" (UID: \"d924f586-eb76-40c9-a8d7-10f9f9935511\") " pod="openstack/nova-metadata-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.226312 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d924f586-eb76-40c9-a8d7-10f9f9935511-config-data\") pod \"nova-metadata-0\" (UID: \"d924f586-eb76-40c9-a8d7-10f9f9935511\") " pod="openstack/nova-metadata-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.226335 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d924f586-eb76-40c9-a8d7-10f9f9935511-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d924f586-eb76-40c9-a8d7-10f9f9935511\") " pod="openstack/nova-metadata-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.226390 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6dqp\" (UniqueName: 
\"kubernetes.io/projected/d924f586-eb76-40c9-a8d7-10f9f9935511-kube-api-access-g6dqp\") pod \"nova-metadata-0\" (UID: \"d924f586-eb76-40c9-a8d7-10f9f9935511\") " pod="openstack/nova-metadata-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.227120 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d924f586-eb76-40c9-a8d7-10f9f9935511-logs\") pod \"nova-metadata-0\" (UID: \"d924f586-eb76-40c9-a8d7-10f9f9935511\") " pod="openstack/nova-metadata-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.227860 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.229091 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.241777 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d924f586-eb76-40c9-a8d7-10f9f9935511-config-data\") pod \"nova-metadata-0\" (UID: \"d924f586-eb76-40c9-a8d7-10f9f9935511\") " pod="openstack/nova-metadata-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.245453 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.252810 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d924f586-eb76-40c9-a8d7-10f9f9935511-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"d924f586-eb76-40c9-a8d7-10f9f9935511\") " pod="openstack/nova-metadata-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.253269 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6dqp\" (UniqueName: 
\"kubernetes.io/projected/d924f586-eb76-40c9-a8d7-10f9f9935511-kube-api-access-g6dqp\") pod \"nova-metadata-0\" (UID: \"d924f586-eb76-40c9-a8d7-10f9f9935511\") " pod="openstack/nova-metadata-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.284041 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.292083 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.301644 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.308411 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-26rbh"] Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.310348 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.317705 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-26rbh"] Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.327845 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.327930 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7cd8e59-b93c-4145-b7bc-c529a915598d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7cd8e59-b93c-4145-b7bc-c529a915598d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 
10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.327985 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4bl5d\" (UniqueName: \"kubernetes.io/projected/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-kube-api-access-4bl5d\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.328046 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7cd8e59-b93c-4145-b7bc-c529a915598d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7cd8e59-b93c-4145-b7bc-c529a915598d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.328101 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.328126 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-dns-svc\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.328178 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-config\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " 
pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.328242 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.328276 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9qf7\" (UniqueName: \"kubernetes.io/projected/b7cd8e59-b93c-4145-b7bc-c529a915598d-kube-api-access-v9qf7\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7cd8e59-b93c-4145-b7bc-c529a915598d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.336334 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.436452 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7cd8e59-b93c-4145-b7bc-c529a915598d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7cd8e59-b93c-4145-b7bc-c529a915598d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.436942 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.437180 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: 
\"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-dns-svc\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.437255 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-config\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.437302 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.437325 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9qf7\" (UniqueName: \"kubernetes.io/projected/b7cd8e59-b93c-4145-b7bc-c529a915598d-kube-api-access-v9qf7\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7cd8e59-b93c-4145-b7bc-c529a915598d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.437560 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.437600 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7cd8e59-b93c-4145-b7bc-c529a915598d-config-data\") pod 
\"nova-cell1-novncproxy-0\" (UID: \"b7cd8e59-b93c-4145-b7bc-c529a915598d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.437662 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4bl5d\" (UniqueName: \"kubernetes.io/projected/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-kube-api-access-4bl5d\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.439272 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-config\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.440944 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-ovsdbserver-sb\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.441019 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-dns-svc\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.442237 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-ovsdbserver-nb\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " 
pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.443352 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7cd8e59-b93c-4145-b7bc-c529a915598d-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7cd8e59-b93c-4145-b7bc-c529a915598d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.445671 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-dns-swift-storage-0\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.457445 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7cd8e59-b93c-4145-b7bc-c529a915598d-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7cd8e59-b93c-4145-b7bc-c529a915598d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.457831 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9qf7\" (UniqueName: \"kubernetes.io/projected/b7cd8e59-b93c-4145-b7bc-c529a915598d-kube-api-access-v9qf7\") pod \"nova-cell1-novncproxy-0\" (UID: \"b7cd8e59-b93c-4145-b7bc-c529a915598d\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.469518 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4bl5d\" (UniqueName: \"kubernetes.io/projected/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-kube-api-access-4bl5d\") pod \"dnsmasq-dns-bccf8f775-26rbh\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 
10:44:59.656012 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.669379 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.806216 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell0-cell-mapping-7rjkg"] Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.863164 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7rjkg" event={"ID":"313b3021-c103-47ac-9cb5-b38e971d22fd","Type":"ContainerStarted","Data":"b1b11fe0913bfb11371e2e5db352daee604039b68c3936b48d2804630a0461d7"} Mar 19 10:44:59 crc kubenswrapper[4765]: W0319 10:44:59.947409 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf13d82bb_947c_43a3_8681_00c59cb179d9.slice/crio-c6a49a35bb52caedf44c4612ee987a233774bb01e11f636f401a616684ba3b35 WatchSource:0}: Error finding container c6a49a35bb52caedf44c4612ee987a233774bb01e11f636f401a616684ba3b35: Status 404 returned error can't find the container with id c6a49a35bb52caedf44c4612ee987a233774bb01e11f636f401a616684ba3b35 Mar 19 10:44:59 crc kubenswrapper[4765]: I0319 10:44:59.947715 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.004629 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fm985"] Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.010529 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.015004 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.029597 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-scripts" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.044575 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 10:45:00 crc kubenswrapper[4765]: W0319 10:45:00.046187 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd924f586_eb76_40c9_a8d7_10f9f9935511.slice/crio-df962087ef50707c8839559b65025ac52b623036a637ed07e21a00959570d752 WatchSource:0}: Error finding container df962087ef50707c8839559b65025ac52b623036a637ed07e21a00959570d752: Status 404 returned error can't find the container with id df962087ef50707c8839559b65025ac52b623036a637ed07e21a00959570d752 Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.054369 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgssf\" (UniqueName: \"kubernetes.io/projected/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-kube-api-access-lgssf\") pod \"nova-cell1-conductor-db-sync-fm985\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.054415 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-scripts\") pod \"nova-cell1-conductor-db-sync-fm985\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:00 crc kubenswrapper[4765]: 
I0319 10:45:00.054513 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fm985\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.054788 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-config-data\") pod \"nova-cell1-conductor-db-sync-fm985\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.065172 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fm985"] Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.158770 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgssf\" (UniqueName: \"kubernetes.io/projected/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-kube-api-access-lgssf\") pod \"nova-cell1-conductor-db-sync-fm985\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.158823 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-scripts\") pod \"nova-cell1-conductor-db-sync-fm985\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.158860 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fm985\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.158985 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-config-data\") pod \"nova-cell1-conductor-db-sync-fm985\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.187776 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-config-data\") pod \"nova-cell1-conductor-db-sync-fm985\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.192749 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-scripts\") pod \"nova-cell1-conductor-db-sync-fm985\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.193406 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-combined-ca-bundle\") pod \"nova-cell1-conductor-db-sync-fm985\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.198291 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgssf\" (UniqueName: 
\"kubernetes.io/projected/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-kube-api-access-lgssf\") pod \"nova-cell1-conductor-db-sync-fm985\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.227765 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.252218 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz"] Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.255169 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.257659 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.258280 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.273252 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz"] Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.288826 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.366008 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76c9r\" (UniqueName: \"kubernetes.io/projected/c3549b53-da9a-4aa4-b4b7-9e89eb017916-kube-api-access-76c9r\") pod \"collect-profiles-29565285-mbxvz\" (UID: \"c3549b53-da9a-4aa4-b4b7-9e89eb017916\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.366316 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3549b53-da9a-4aa4-b4b7-9e89eb017916-secret-volume\") pod \"collect-profiles-29565285-mbxvz\" (UID: \"c3549b53-da9a-4aa4-b4b7-9e89eb017916\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.367446 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3549b53-da9a-4aa4-b4b7-9e89eb017916-config-volume\") pod \"collect-profiles-29565285-mbxvz\" (UID: \"c3549b53-da9a-4aa4-b4b7-9e89eb017916\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.392858 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.403929 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-26rbh"] Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.475314 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-76c9r\" (UniqueName: \"kubernetes.io/projected/c3549b53-da9a-4aa4-b4b7-9e89eb017916-kube-api-access-76c9r\") 
pod \"collect-profiles-29565285-mbxvz\" (UID: \"c3549b53-da9a-4aa4-b4b7-9e89eb017916\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.475371 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3549b53-da9a-4aa4-b4b7-9e89eb017916-secret-volume\") pod \"collect-profiles-29565285-mbxvz\" (UID: \"c3549b53-da9a-4aa4-b4b7-9e89eb017916\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.475501 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3549b53-da9a-4aa4-b4b7-9e89eb017916-config-volume\") pod \"collect-profiles-29565285-mbxvz\" (UID: \"c3549b53-da9a-4aa4-b4b7-9e89eb017916\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.479266 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3549b53-da9a-4aa4-b4b7-9e89eb017916-config-volume\") pod \"collect-profiles-29565285-mbxvz\" (UID: \"c3549b53-da9a-4aa4-b4b7-9e89eb017916\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.483055 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3549b53-da9a-4aa4-b4b7-9e89eb017916-secret-volume\") pod \"collect-profiles-29565285-mbxvz\" (UID: \"c3549b53-da9a-4aa4-b4b7-9e89eb017916\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.496287 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-76c9r\" (UniqueName: 
\"kubernetes.io/projected/c3549b53-da9a-4aa4-b4b7-9e89eb017916-kube-api-access-76c9r\") pod \"collect-profiles-29565285-mbxvz\" (UID: \"c3549b53-da9a-4aa4-b4b7-9e89eb017916\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.612391 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.813298 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fm985"] Mar 19 10:45:00 crc kubenswrapper[4765]: W0319 10:45:00.828716 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podff89b19a_24e7_4a8c_abd4_cfc23a17d7c7.slice/crio-8949b9f93d06741e308df5a474ad242ab6acb3f060586d1296b85ee9f7b7e23f WatchSource:0}: Error finding container 8949b9f93d06741e308df5a474ad242ab6acb3f060586d1296b85ee9f7b7e23f: Status 404 returned error can't find the container with id 8949b9f93d06741e308df5a474ad242ab6acb3f060586d1296b85ee9f7b7e23f Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.875397 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f13d82bb-947c-43a3-8681-00c59cb179d9","Type":"ContainerStarted","Data":"c6a49a35bb52caedf44c4612ee987a233774bb01e11f636f401a616684ba3b35"} Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.877523 4765 generic.go:334] "Generic (PLEG): container finished" podID="39a8a9e0-6620-4e67-9b23-7acd3a2aff5d" containerID="9cd5ffafcf386aca9a16e09b726a5bdd6fa060debfbf7b4792da29ec781ef4a5" exitCode=0 Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.877588 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-26rbh" 
event={"ID":"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d","Type":"ContainerDied","Data":"9cd5ffafcf386aca9a16e09b726a5bdd6fa060debfbf7b4792da29ec781ef4a5"} Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.877611 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-26rbh" event={"ID":"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d","Type":"ContainerStarted","Data":"299c0466c94eeb8bfd55d5387030e2d130e6f61cf0e5bc3707be964495cadaf3"} Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.883150 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7cd8e59-b93c-4145-b7bc-c529a915598d","Type":"ContainerStarted","Data":"ab589258975ed644aaa4d702f91a30d6a9765ffabca7a1dcc44df87160e883d6"} Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.886064 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d924f586-eb76-40c9-a8d7-10f9f9935511","Type":"ContainerStarted","Data":"df962087ef50707c8839559b65025ac52b623036a637ed07e21a00959570d752"} Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.889163 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2","Type":"ContainerStarted","Data":"f98753adfeb5d1d9c506660df0f0261df6498cabd7502d3de99e1026c4e33101"} Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.905314 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7rjkg" event={"ID":"313b3021-c103-47ac-9cb5-b38e971d22fd","Type":"ContainerStarted","Data":"d41a94ebe6dac2fd803bb512fe51a25e92e5c37754d765f38f9c7968df6f685c"} Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.913037 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fm985" 
event={"ID":"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7","Type":"ContainerStarted","Data":"8949b9f93d06741e308df5a474ad242ab6acb3f060586d1296b85ee9f7b7e23f"} Mar 19 10:45:00 crc kubenswrapper[4765]: I0319 10:45:00.930682 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell0-cell-mapping-7rjkg" podStartSLOduration=2.930662448 podStartE2EDuration="2.930662448s" podCreationTimestamp="2026-03-19 10:44:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:45:00.920225866 +0000 UTC m=+1399.269171408" watchObservedRunningTime="2026-03-19 10:45:00.930662448 +0000 UTC m=+1399.279607990" Mar 19 10:45:01 crc kubenswrapper[4765]: I0319 10:45:01.140609 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz"] Mar 19 10:45:01 crc kubenswrapper[4765]: W0319 10:45:01.159578 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3549b53_da9a_4aa4_b4b7_9e89eb017916.slice/crio-fb3008f5a9a03ad3344172f4c4e07b639ef91ee63b606850be97ddf7eb1a5128 WatchSource:0}: Error finding container fb3008f5a9a03ad3344172f4c4e07b639ef91ee63b606850be97ddf7eb1a5128: Status 404 returned error can't find the container with id fb3008f5a9a03ad3344172f4c4e07b639ef91ee63b606850be97ddf7eb1a5128 Mar 19 10:45:01 crc kubenswrapper[4765]: I0319 10:45:01.930386 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fm985" event={"ID":"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7","Type":"ContainerStarted","Data":"746ccb08310d5fcb894b2b286fba879a7a461060b0fffa5c493abf918fecb04e"} Mar 19 10:45:01 crc kubenswrapper[4765]: I0319 10:45:01.937425 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-26rbh" 
event={"ID":"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d","Type":"ContainerStarted","Data":"ab40e9c6c0495b21297f27b2f9dfb9064a146f0ae4022348f6abc259daca6845"} Mar 19 10:45:01 crc kubenswrapper[4765]: I0319 10:45:01.938265 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:45:01 crc kubenswrapper[4765]: I0319 10:45:01.941789 4765 generic.go:334] "Generic (PLEG): container finished" podID="c3549b53-da9a-4aa4-b4b7-9e89eb017916" containerID="36f29a72904deb4ccb3bd3a786e69b9b23b838596177b6a63037afd7c4dcb067" exitCode=0 Mar 19 10:45:01 crc kubenswrapper[4765]: I0319 10:45:01.942831 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" event={"ID":"c3549b53-da9a-4aa4-b4b7-9e89eb017916","Type":"ContainerDied","Data":"36f29a72904deb4ccb3bd3a786e69b9b23b838596177b6a63037afd7c4dcb067"} Mar 19 10:45:01 crc kubenswrapper[4765]: I0319 10:45:01.942892 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" event={"ID":"c3549b53-da9a-4aa4-b4b7-9e89eb017916","Type":"ContainerStarted","Data":"fb3008f5a9a03ad3344172f4c4e07b639ef91ee63b606850be97ddf7eb1a5128"} Mar 19 10:45:01 crc kubenswrapper[4765]: I0319 10:45:01.989397 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-db-sync-fm985" podStartSLOduration=2.9893785619999997 podStartE2EDuration="2.989378562s" podCreationTimestamp="2026-03-19 10:44:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:45:01.94975333 +0000 UTC m=+1400.298698872" watchObservedRunningTime="2026-03-19 10:45:01.989378562 +0000 UTC m=+1400.338324094" Mar 19 10:45:02 crc kubenswrapper[4765]: I0319 10:45:02.027908 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-bccf8f775-26rbh" podStartSLOduration=3.027883572 podStartE2EDuration="3.027883572s" podCreationTimestamp="2026-03-19 10:44:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:45:02.002755317 +0000 UTC m=+1400.351700859" watchObservedRunningTime="2026-03-19 10:45:02.027883572 +0000 UTC m=+1400.376829114" Mar 19 10:45:02 crc kubenswrapper[4765]: I0319 10:45:02.064088 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 10:45:02 crc kubenswrapper[4765]: I0319 10:45:02.544545 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 10:45:02 crc kubenswrapper[4765]: I0319 10:45:02.571604 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 10:45:04 crc kubenswrapper[4765]: I0319 10:45:04.517362 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" Mar 19 10:45:04 crc kubenswrapper[4765]: I0319 10:45:04.618095 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3549b53-da9a-4aa4-b4b7-9e89eb017916-secret-volume\") pod \"c3549b53-da9a-4aa4-b4b7-9e89eb017916\" (UID: \"c3549b53-da9a-4aa4-b4b7-9e89eb017916\") " Mar 19 10:45:04 crc kubenswrapper[4765]: I0319 10:45:04.618289 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-76c9r\" (UniqueName: \"kubernetes.io/projected/c3549b53-da9a-4aa4-b4b7-9e89eb017916-kube-api-access-76c9r\") pod \"c3549b53-da9a-4aa4-b4b7-9e89eb017916\" (UID: \"c3549b53-da9a-4aa4-b4b7-9e89eb017916\") " Mar 19 10:45:04 crc kubenswrapper[4765]: I0319 10:45:04.618707 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3549b53-da9a-4aa4-b4b7-9e89eb017916-config-volume\") pod \"c3549b53-da9a-4aa4-b4b7-9e89eb017916\" (UID: \"c3549b53-da9a-4aa4-b4b7-9e89eb017916\") " Mar 19 10:45:04 crc kubenswrapper[4765]: I0319 10:45:04.620100 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3549b53-da9a-4aa4-b4b7-9e89eb017916-config-volume" (OuterVolumeSpecName: "config-volume") pod "c3549b53-da9a-4aa4-b4b7-9e89eb017916" (UID: "c3549b53-da9a-4aa4-b4b7-9e89eb017916"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:45:04 crc kubenswrapper[4765]: I0319 10:45:04.626862 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3549b53-da9a-4aa4-b4b7-9e89eb017916-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "c3549b53-da9a-4aa4-b4b7-9e89eb017916" (UID: "c3549b53-da9a-4aa4-b4b7-9e89eb017916"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:04 crc kubenswrapper[4765]: I0319 10:45:04.628692 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3549b53-da9a-4aa4-b4b7-9e89eb017916-kube-api-access-76c9r" (OuterVolumeSpecName: "kube-api-access-76c9r") pod "c3549b53-da9a-4aa4-b4b7-9e89eb017916" (UID: "c3549b53-da9a-4aa4-b4b7-9e89eb017916"). InnerVolumeSpecName "kube-api-access-76c9r". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:04 crc kubenswrapper[4765]: I0319 10:45:04.721137 4765 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/c3549b53-da9a-4aa4-b4b7-9e89eb017916-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:04 crc kubenswrapper[4765]: I0319 10:45:04.721367 4765 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/c3549b53-da9a-4aa4-b4b7-9e89eb017916-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:04 crc kubenswrapper[4765]: I0319 10:45:04.721454 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-76c9r\" (UniqueName: \"kubernetes.io/projected/c3549b53-da9a-4aa4-b4b7-9e89eb017916-kube-api-access-76c9r\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:05 crc kubenswrapper[4765]: I0319 10:45:05.025332 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f13d82bb-947c-43a3-8681-00c59cb179d9","Type":"ContainerStarted","Data":"b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b"} Mar 19 10:45:05 crc kubenswrapper[4765]: I0319 10:45:05.029057 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7cd8e59-b93c-4145-b7bc-c529a915598d","Type":"ContainerStarted","Data":"1b86d8deeb75f041c831dd87ed8a06440c475be8b2ec3919bcded6bc99e0b42d"} Mar 19 10:45:05 crc kubenswrapper[4765]: I0319 
10:45:05.029136 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-cell1-novncproxy-0" podUID="b7cd8e59-b93c-4145-b7bc-c529a915598d" containerName="nova-cell1-novncproxy-novncproxy" containerID="cri-o://1b86d8deeb75f041c831dd87ed8a06440c475be8b2ec3919bcded6bc99e0b42d" gracePeriod=30 Mar 19 10:45:05 crc kubenswrapper[4765]: I0319 10:45:05.031905 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d924f586-eb76-40c9-a8d7-10f9f9935511","Type":"ContainerStarted","Data":"41c98f63e439a31ce9fe1b03c5867cc98db3c6e9f3ff2edb00734a3911ae3efc"} Mar 19 10:45:05 crc kubenswrapper[4765]: I0319 10:45:05.032054 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d924f586-eb76-40c9-a8d7-10f9f9935511" containerName="nova-metadata-log" containerID="cri-o://41c98f63e439a31ce9fe1b03c5867cc98db3c6e9f3ff2edb00734a3911ae3efc" gracePeriod=30 Mar 19 10:45:05 crc kubenswrapper[4765]: I0319 10:45:05.032117 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="d924f586-eb76-40c9-a8d7-10f9f9935511" containerName="nova-metadata-metadata" containerID="cri-o://653fdf72127ff457c6a05cee3a00ec1f29d1982c0ae52a69fdaef73efefb4a3a" gracePeriod=30 Mar 19 10:45:05 crc kubenswrapper[4765]: I0319 10:45:05.036675 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2","Type":"ContainerStarted","Data":"8440c740ba45f159c4729acf2963d0a61a51e3e0517feb0917d667727b25a302"} Mar 19 10:45:05 crc kubenswrapper[4765]: I0319 10:45:05.036735 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2","Type":"ContainerStarted","Data":"c6d3884e7157d5721faedca5951c0847ab5132bf77003a1f3bc6e41d7f43ce98"} Mar 19 10:45:05 crc kubenswrapper[4765]: I0319 10:45:05.041044 
4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" event={"ID":"c3549b53-da9a-4aa4-b4b7-9e89eb017916","Type":"ContainerDied","Data":"fb3008f5a9a03ad3344172f4c4e07b639ef91ee63b606850be97ddf7eb1a5128"} Mar 19 10:45:05 crc kubenswrapper[4765]: I0319 10:45:05.041082 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fb3008f5a9a03ad3344172f4c4e07b639ef91ee63b606850be97ddf7eb1a5128" Mar 19 10:45:05 crc kubenswrapper[4765]: I0319 10:45:05.041082 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz" Mar 19 10:45:05 crc kubenswrapper[4765]: I0319 10:45:05.049861 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.6476362 podStartE2EDuration="7.049834359s" podCreationTimestamp="2026-03-19 10:44:58 +0000 UTC" firstStartedPulling="2026-03-19 10:44:59.954841699 +0000 UTC m=+1398.303787241" lastFinishedPulling="2026-03-19 10:45:04.357039858 +0000 UTC m=+1402.705985400" observedRunningTime="2026-03-19 10:45:05.043563333 +0000 UTC m=+1403.392508875" watchObservedRunningTime="2026-03-19 10:45:05.049834359 +0000 UTC m=+1403.398779901" Mar 19 10:45:05 crc kubenswrapper[4765]: I0319 10:45:05.068535 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=1.6865168910000001 podStartE2EDuration="6.068516873s" podCreationTimestamp="2026-03-19 10:44:59 +0000 UTC" firstStartedPulling="2026-03-19 10:45:00.048178117 +0000 UTC m=+1398.397123659" lastFinishedPulling="2026-03-19 10:45:04.430178099 +0000 UTC m=+1402.779123641" observedRunningTime="2026-03-19 10:45:05.06662488 +0000 UTC m=+1403.415570452" watchObservedRunningTime="2026-03-19 10:45:05.068516873 +0000 UTC m=+1403.417462415" Mar 19 10:45:05 crc kubenswrapper[4765]: I0319 
10:45:05.107044 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.94639579 podStartE2EDuration="7.107025143s" podCreationTimestamp="2026-03-19 10:44:58 +0000 UTC" firstStartedPulling="2026-03-19 10:45:00.228423803 +0000 UTC m=+1398.577369345" lastFinishedPulling="2026-03-19 10:45:04.389053156 +0000 UTC m=+1402.737998698" observedRunningTime="2026-03-19 10:45:05.090561631 +0000 UTC m=+1403.439507193" watchObservedRunningTime="2026-03-19 10:45:05.107025143 +0000 UTC m=+1403.455970685" Mar 19 10:45:05 crc kubenswrapper[4765]: I0319 10:45:05.109582 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=2.138981201 podStartE2EDuration="6.109569224s" podCreationTimestamp="2026-03-19 10:44:59 +0000 UTC" firstStartedPulling="2026-03-19 10:45:00.418320379 +0000 UTC m=+1398.767265921" lastFinishedPulling="2026-03-19 10:45:04.388908402 +0000 UTC m=+1402.737853944" observedRunningTime="2026-03-19 10:45:05.106087877 +0000 UTC m=+1403.455033449" watchObservedRunningTime="2026-03-19 10:45:05.109569224 +0000 UTC m=+1403.458514766" Mar 19 10:45:06 crc kubenswrapper[4765]: I0319 10:45:06.052296 4765 generic.go:334] "Generic (PLEG): container finished" podID="d924f586-eb76-40c9-a8d7-10f9f9935511" containerID="41c98f63e439a31ce9fe1b03c5867cc98db3c6e9f3ff2edb00734a3911ae3efc" exitCode=143 Mar 19 10:45:06 crc kubenswrapper[4765]: I0319 10:45:06.052386 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d924f586-eb76-40c9-a8d7-10f9f9935511","Type":"ContainerStarted","Data":"653fdf72127ff457c6a05cee3a00ec1f29d1982c0ae52a69fdaef73efefb4a3a"} Mar 19 10:45:06 crc kubenswrapper[4765]: I0319 10:45:06.053449 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d924f586-eb76-40c9-a8d7-10f9f9935511","Type":"ContainerDied","Data":"41c98f63e439a31ce9fe1b03c5867cc98db3c6e9f3ff2edb00734a3911ae3efc"} Mar 19 10:45:07 crc kubenswrapper[4765]: I0319 10:45:07.211026 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 10:45:07 crc kubenswrapper[4765]: I0319 10:45:07.211624 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/kube-state-metrics-0" podUID="2aac07ca-a4d1-4730-ad33-00f6c3d0e418" containerName="kube-state-metrics" containerID="cri-o://b8da5bb7acfe81498f9794593fc593277e21c3a90e12efe6e045cbdfe591333a" gracePeriod=30 Mar 19 10:45:07 crc kubenswrapper[4765]: I0319 10:45:07.696617 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 10:45:07 crc kubenswrapper[4765]: I0319 10:45:07.796052 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf6dm\" (UniqueName: \"kubernetes.io/projected/2aac07ca-a4d1-4730-ad33-00f6c3d0e418-kube-api-access-bf6dm\") pod \"2aac07ca-a4d1-4730-ad33-00f6c3d0e418\" (UID: \"2aac07ca-a4d1-4730-ad33-00f6c3d0e418\") " Mar 19 10:45:07 crc kubenswrapper[4765]: I0319 10:45:07.805544 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aac07ca-a4d1-4730-ad33-00f6c3d0e418-kube-api-access-bf6dm" (OuterVolumeSpecName: "kube-api-access-bf6dm") pod "2aac07ca-a4d1-4730-ad33-00f6c3d0e418" (UID: "2aac07ca-a4d1-4730-ad33-00f6c3d0e418"). InnerVolumeSpecName "kube-api-access-bf6dm". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:07 crc kubenswrapper[4765]: I0319 10:45:07.898563 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf6dm\" (UniqueName: \"kubernetes.io/projected/2aac07ca-a4d1-4730-ad33-00f6c3d0e418-kube-api-access-bf6dm\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.073754 4765 generic.go:334] "Generic (PLEG): container finished" podID="2aac07ca-a4d1-4730-ad33-00f6c3d0e418" containerID="b8da5bb7acfe81498f9794593fc593277e21c3a90e12efe6e045cbdfe591333a" exitCode=2 Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.073820 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2aac07ca-a4d1-4730-ad33-00f6c3d0e418","Type":"ContainerDied","Data":"b8da5bb7acfe81498f9794593fc593277e21c3a90e12efe6e045cbdfe591333a"} Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.073929 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"2aac07ca-a4d1-4730-ad33-00f6c3d0e418","Type":"ContainerDied","Data":"3223c31b0e270fc0aed39b662ac1ef5a200ba871886471426060c50c14abbeff"} Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.073979 4765 scope.go:117] "RemoveContainer" containerID="b8da5bb7acfe81498f9794593fc593277e21c3a90e12efe6e045cbdfe591333a" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.073849 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.117539 4765 scope.go:117] "RemoveContainer" containerID="b8da5bb7acfe81498f9794593fc593277e21c3a90e12efe6e045cbdfe591333a" Mar 19 10:45:08 crc kubenswrapper[4765]: E0319 10:45:08.119183 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8da5bb7acfe81498f9794593fc593277e21c3a90e12efe6e045cbdfe591333a\": container with ID starting with b8da5bb7acfe81498f9794593fc593277e21c3a90e12efe6e045cbdfe591333a not found: ID does not exist" containerID="b8da5bb7acfe81498f9794593fc593277e21c3a90e12efe6e045cbdfe591333a" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.119230 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8da5bb7acfe81498f9794593fc593277e21c3a90e12efe6e045cbdfe591333a"} err="failed to get container status \"b8da5bb7acfe81498f9794593fc593277e21c3a90e12efe6e045cbdfe591333a\": rpc error: code = NotFound desc = could not find container \"b8da5bb7acfe81498f9794593fc593277e21c3a90e12efe6e045cbdfe591333a\": container with ID starting with b8da5bb7acfe81498f9794593fc593277e21c3a90e12efe6e045cbdfe591333a not found: ID does not exist" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.130020 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.143743 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.156637 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 10:45:08 crc kubenswrapper[4765]: E0319 10:45:08.157491 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3549b53-da9a-4aa4-b4b7-9e89eb017916" containerName="collect-profiles" Mar 19 10:45:08 crc 
kubenswrapper[4765]: I0319 10:45:08.157601 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3549b53-da9a-4aa4-b4b7-9e89eb017916" containerName="collect-profiles" Mar 19 10:45:08 crc kubenswrapper[4765]: E0319 10:45:08.157719 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2aac07ca-a4d1-4730-ad33-00f6c3d0e418" containerName="kube-state-metrics" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.157791 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aac07ca-a4d1-4730-ad33-00f6c3d0e418" containerName="kube-state-metrics" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.169886 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3549b53-da9a-4aa4-b4b7-9e89eb017916" containerName="collect-profiles" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.169932 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="2aac07ca-a4d1-4730-ad33-00f6c3d0e418" containerName="kube-state-metrics" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.170581 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.170680 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.175447 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-kube-state-metrics-svc" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.175447 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"kube-state-metrics-tls-config" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.203800 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06359a74-a7cd-45ab-bc64-ef3d71373e5a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"06359a74-a7cd-45ab-bc64-ef3d71373e5a\") " pod="openstack/kube-state-metrics-0" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.203868 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/06359a74-a7cd-45ab-bc64-ef3d71373e5a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"06359a74-a7cd-45ab-bc64-ef3d71373e5a\") " pod="openstack/kube-state-metrics-0" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.203928 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q77xv\" (UniqueName: \"kubernetes.io/projected/06359a74-a7cd-45ab-bc64-ef3d71373e5a-kube-api-access-q77xv\") pod \"kube-state-metrics-0\" (UID: \"06359a74-a7cd-45ab-bc64-ef3d71373e5a\") " pod="openstack/kube-state-metrics-0" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.204030 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/06359a74-a7cd-45ab-bc64-ef3d71373e5a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: 
\"06359a74-a7cd-45ab-bc64-ef3d71373e5a\") " pod="openstack/kube-state-metrics-0" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.305500 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06359a74-a7cd-45ab-bc64-ef3d71373e5a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"06359a74-a7cd-45ab-bc64-ef3d71373e5a\") " pod="openstack/kube-state-metrics-0" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.305754 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/06359a74-a7cd-45ab-bc64-ef3d71373e5a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"06359a74-a7cd-45ab-bc64-ef3d71373e5a\") " pod="openstack/kube-state-metrics-0" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.305801 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q77xv\" (UniqueName: \"kubernetes.io/projected/06359a74-a7cd-45ab-bc64-ef3d71373e5a-kube-api-access-q77xv\") pod \"kube-state-metrics-0\" (UID: \"06359a74-a7cd-45ab-bc64-ef3d71373e5a\") " pod="openstack/kube-state-metrics-0" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.305860 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/06359a74-a7cd-45ab-bc64-ef3d71373e5a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"06359a74-a7cd-45ab-bc64-ef3d71373e5a\") " pod="openstack/kube-state-metrics-0" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.315823 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-certs\" (UniqueName: \"kubernetes.io/secret/06359a74-a7cd-45ab-bc64-ef3d71373e5a-kube-state-metrics-tls-certs\") pod \"kube-state-metrics-0\" (UID: \"06359a74-a7cd-45ab-bc64-ef3d71373e5a\") " 
pod="openstack/kube-state-metrics-0" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.372898 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/06359a74-a7cd-45ab-bc64-ef3d71373e5a-combined-ca-bundle\") pod \"kube-state-metrics-0\" (UID: \"06359a74-a7cd-45ab-bc64-ef3d71373e5a\") " pod="openstack/kube-state-metrics-0" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.373859 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls-config\" (UniqueName: \"kubernetes.io/secret/06359a74-a7cd-45ab-bc64-ef3d71373e5a-kube-state-metrics-tls-config\") pod \"kube-state-metrics-0\" (UID: \"06359a74-a7cd-45ab-bc64-ef3d71373e5a\") " pod="openstack/kube-state-metrics-0" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.377331 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q77xv\" (UniqueName: \"kubernetes.io/projected/06359a74-a7cd-45ab-bc64-ef3d71373e5a-kube-api-access-q77xv\") pod \"kube-state-metrics-0\" (UID: \"06359a74-a7cd-45ab-bc64-ef3d71373e5a\") " pod="openstack/kube-state-metrics-0" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.382602 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aac07ca-a4d1-4730-ad33-00f6c3d0e418" path="/var/lib/kubelet/pods/2aac07ca-a4d1-4730-ad33-00f6c3d0e418/volumes" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.491841 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/kube-state-metrics-0" Mar 19 10:45:08 crc kubenswrapper[4765]: I0319 10:45:08.983857 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/kube-state-metrics-0"] Mar 19 10:45:08 crc kubenswrapper[4765]: W0319 10:45:08.984027 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod06359a74_a7cd_45ab_bc64_ef3d71373e5a.slice/crio-9e52ecb3eda4ef95bb390de9ea4fc5661df123877254432ee74cbeb981882fa9 WatchSource:0}: Error finding container 9e52ecb3eda4ef95bb390de9ea4fc5661df123877254432ee74cbeb981882fa9: Status 404 returned error can't find the container with id 9e52ecb3eda4ef95bb390de9ea4fc5661df123877254432ee74cbeb981882fa9 Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.085601 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"06359a74-a7cd-45ab-bc64-ef3d71373e5a","Type":"ContainerStarted","Data":"9e52ecb3eda4ef95bb390de9ea4fc5661df123877254432ee74cbeb981882fa9"} Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.087843 4765 generic.go:334] "Generic (PLEG): container finished" podID="ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7" containerID="746ccb08310d5fcb894b2b286fba879a7a461060b0fffa5c493abf918fecb04e" exitCode=0 Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.087881 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fm985" event={"ID":"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7","Type":"ContainerDied","Data":"746ccb08310d5fcb894b2b286fba879a7a461060b0fffa5c493abf918fecb04e"} Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.293492 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.293555 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 10:45:09 crc 
kubenswrapper[4765]: I0319 10:45:09.303231 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.303269 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.306009 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.306291 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="ceilometer-central-agent" containerID="cri-o://7fd7c5b075326d2630e8509d2874b4a7931bd2d4fb28c1d2328de2daf3720f15" gracePeriod=30 Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.306681 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="proxy-httpd" containerID="cri-o://3a42f834745602a6c5e722e1de4dee0c09470d9cd8a7dbc46fa8cea16f066c33" gracePeriod=30 Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.306724 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="sg-core" containerID="cri-o://a28d6020624e0ff33f61930cb465b77be95b1eea5da41229812c2eaf07f0fd15" gracePeriod=30 Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.306755 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="ceilometer-notification-agent" containerID="cri-o://eaa3319bac0f28efb9983e12dd4ff24efc0e820215132e22f6f3366ccf6479f7" gracePeriod=30 Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.333090 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openstack/nova-scheduler-0" Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.656848 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.671650 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.751640 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-v7b7s"] Mar 19 10:45:09 crc kubenswrapper[4765]: I0319 10:45:09.751910 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" podUID="630c00dd-9d08-4035-88a2-0533792f2118" containerName="dnsmasq-dns" containerID="cri-o://ae106f7b6a7ef53eb6414ea44327434d663771da87646889db8d4223430fd2ed" gracePeriod=10 Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.103824 4765 generic.go:334] "Generic (PLEG): container finished" podID="313b3021-c103-47ac-9cb5-b38e971d22fd" containerID="d41a94ebe6dac2fd803bb512fe51a25e92e5c37754d765f38f9c7968df6f685c" exitCode=0 Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.104147 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7rjkg" event={"ID":"313b3021-c103-47ac-9cb5-b38e971d22fd","Type":"ContainerDied","Data":"d41a94ebe6dac2fd803bb512fe51a25e92e5c37754d765f38f9c7968df6f685c"} Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.146542 4765 generic.go:334] "Generic (PLEG): container finished" podID="630c00dd-9d08-4035-88a2-0533792f2118" containerID="ae106f7b6a7ef53eb6414ea44327434d663771da87646889db8d4223430fd2ed" exitCode=0 Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.146636 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" 
event={"ID":"630c00dd-9d08-4035-88a2-0533792f2118","Type":"ContainerDied","Data":"ae106f7b6a7ef53eb6414ea44327434d663771da87646889db8d4223430fd2ed"} Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.151326 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/kube-state-metrics-0" event={"ID":"06359a74-a7cd-45ab-bc64-ef3d71373e5a","Type":"ContainerStarted","Data":"9c83d6c0c31abf63e483db9414d669ea845c90c5af0fd6161cd4988a84cbfbd2"} Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.152374 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/kube-state-metrics-0" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.174343 4765 generic.go:334] "Generic (PLEG): container finished" podID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerID="3a42f834745602a6c5e722e1de4dee0c09470d9cd8a7dbc46fa8cea16f066c33" exitCode=0 Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.174387 4765 generic.go:334] "Generic (PLEG): container finished" podID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerID="a28d6020624e0ff33f61930cb465b77be95b1eea5da41229812c2eaf07f0fd15" exitCode=2 Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.174399 4765 generic.go:334] "Generic (PLEG): container finished" podID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerID="7fd7c5b075326d2630e8509d2874b4a7931bd2d4fb28c1d2328de2daf3720f15" exitCode=0 Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.175953 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0afe23f-4c2a-4e60-9f57-189009a620f8","Type":"ContainerDied","Data":"3a42f834745602a6c5e722e1de4dee0c09470d9cd8a7dbc46fa8cea16f066c33"} Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.176004 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0afe23f-4c2a-4e60-9f57-189009a620f8","Type":"ContainerDied","Data":"a28d6020624e0ff33f61930cb465b77be95b1eea5da41229812c2eaf07f0fd15"} Mar 19 
10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.176022 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0afe23f-4c2a-4e60-9f57-189009a620f8","Type":"ContainerDied","Data":"7fd7c5b075326d2630e8509d2874b4a7931bd2d4fb28c1d2328de2daf3720f15"} Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.216620 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.269365 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/kube-state-metrics-0" podStartSLOduration=1.852962851 podStartE2EDuration="2.26934211s" podCreationTimestamp="2026-03-19 10:45:08 +0000 UTC" firstStartedPulling="2026-03-19 10:45:08.988528926 +0000 UTC m=+1407.337474468" lastFinishedPulling="2026-03-19 10:45:09.404908185 +0000 UTC m=+1407.753853727" observedRunningTime="2026-03-19 10:45:10.19626113 +0000 UTC m=+1408.545206682" watchObservedRunningTime="2026-03-19 10:45:10.26934211 +0000 UTC m=+1408.618287652" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.381438 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.390331 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.390398 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.192:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.478485 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-dns-swift-storage-0\") pod \"630c00dd-9d08-4035-88a2-0533792f2118\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.478656 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-config\") pod \"630c00dd-9d08-4035-88a2-0533792f2118\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.478756 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-ovsdbserver-nb\") pod \"630c00dd-9d08-4035-88a2-0533792f2118\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.478837 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-q4ds4\" (UniqueName: \"kubernetes.io/projected/630c00dd-9d08-4035-88a2-0533792f2118-kube-api-access-q4ds4\") pod \"630c00dd-9d08-4035-88a2-0533792f2118\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.478875 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-ovsdbserver-sb\") pod \"630c00dd-9d08-4035-88a2-0533792f2118\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.478947 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-dns-svc\") pod \"630c00dd-9d08-4035-88a2-0533792f2118\" (UID: \"630c00dd-9d08-4035-88a2-0533792f2118\") " Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.561847 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/630c00dd-9d08-4035-88a2-0533792f2118-kube-api-access-q4ds4" (OuterVolumeSpecName: "kube-api-access-q4ds4") pod "630c00dd-9d08-4035-88a2-0533792f2118" (UID: "630c00dd-9d08-4035-88a2-0533792f2118"). InnerVolumeSpecName "kube-api-access-q4ds4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.567069 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-config" (OuterVolumeSpecName: "config") pod "630c00dd-9d08-4035-88a2-0533792f2118" (UID: "630c00dd-9d08-4035-88a2-0533792f2118"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.581616 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.581645 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-q4ds4\" (UniqueName: \"kubernetes.io/projected/630c00dd-9d08-4035-88a2-0533792f2118-kube-api-access-q4ds4\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.589558 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "630c00dd-9d08-4035-88a2-0533792f2118" (UID: "630c00dd-9d08-4035-88a2-0533792f2118"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.619062 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "630c00dd-9d08-4035-88a2-0533792f2118" (UID: "630c00dd-9d08-4035-88a2-0533792f2118"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.626269 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "630c00dd-9d08-4035-88a2-0533792f2118" (UID: "630c00dd-9d08-4035-88a2-0533792f2118"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.673706 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "630c00dd-9d08-4035-88a2-0533792f2118" (UID: "630c00dd-9d08-4035-88a2-0533792f2118"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.685704 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.685738 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.685768 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.685781 4765 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/630c00dd-9d08-4035-88a2-0533792f2118-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.735053 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.787257 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-config-data\") pod \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.787306 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-scripts\") pod \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.787338 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-combined-ca-bundle\") pod \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.787505 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lgssf\" (UniqueName: \"kubernetes.io/projected/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-kube-api-access-lgssf\") pod \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\" (UID: \"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7\") " Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.801606 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-kube-api-access-lgssf" (OuterVolumeSpecName: "kube-api-access-lgssf") pod "ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7" (UID: "ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7"). InnerVolumeSpecName "kube-api-access-lgssf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.805512 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-scripts" (OuterVolumeSpecName: "scripts") pod "ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7" (UID: "ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.821033 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7" (UID: "ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.821610 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-config-data" (OuterVolumeSpecName: "config-data") pod "ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7" (UID: "ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.889650 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lgssf\" (UniqueName: \"kubernetes.io/projected/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-kube-api-access-lgssf\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.889684 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.889696 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:10 crc kubenswrapper[4765]: I0319 10:45:10.889710 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.185334 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" event={"ID":"630c00dd-9d08-4035-88a2-0533792f2118","Type":"ContainerDied","Data":"a88904a133b3aff5a39e966448c7d4a16e8bb1f15038068d61889d0b1e5f50d4"} Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.185388 4765 scope.go:117] "RemoveContainer" containerID="ae106f7b6a7ef53eb6414ea44327434d663771da87646889db8d4223430fd2ed" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.185392 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-6578955fd5-v7b7s" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.207170 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-db-sync-fm985" event={"ID":"ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7","Type":"ContainerDied","Data":"8949b9f93d06741e308df5a474ad242ab6acb3f060586d1296b85ee9f7b7e23f"} Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.207224 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8949b9f93d06741e308df5a474ad242ab6acb3f060586d1296b85ee9f7b7e23f" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.207284 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-db-sync-fm985" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.211340 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 10:45:11 crc kubenswrapper[4765]: E0319 10:45:11.211786 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7" containerName="nova-cell1-conductor-db-sync" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.211807 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7" containerName="nova-cell1-conductor-db-sync" Mar 19 10:45:11 crc kubenswrapper[4765]: E0319 10:45:11.211830 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630c00dd-9d08-4035-88a2-0533792f2118" containerName="init" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.211838 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="630c00dd-9d08-4035-88a2-0533792f2118" containerName="init" Mar 19 10:45:11 crc kubenswrapper[4765]: E0319 10:45:11.211874 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="630c00dd-9d08-4035-88a2-0533792f2118" containerName="dnsmasq-dns" Mar 19 10:45:11 crc kubenswrapper[4765]: 
I0319 10:45:11.211881 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="630c00dd-9d08-4035-88a2-0533792f2118" containerName="dnsmasq-dns" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.212111 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7" containerName="nova-cell1-conductor-db-sync" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.212138 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="630c00dd-9d08-4035-88a2-0533792f2118" containerName="dnsmasq-dns" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.212893 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.217196 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-conductor-config-data" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.225785 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.272326 4765 scope.go:117] "RemoveContainer" containerID="fd3dfdf815374707e8055bce011c2332d13dcdaaae50317da5e096bedf59b64e" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.293905 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-v7b7s"] Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.296664 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ead2164-cda8-432f-b397-2866c55ccdbd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1ead2164-cda8-432f-b397-2866c55ccdbd\") " pod="openstack/nova-cell1-conductor-0" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.296718 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-kd2ln\" (UniqueName: \"kubernetes.io/projected/1ead2164-cda8-432f-b397-2866c55ccdbd-kube-api-access-kd2ln\") pod \"nova-cell1-conductor-0\" (UID: \"1ead2164-cda8-432f-b397-2866c55ccdbd\") " pod="openstack/nova-cell1-conductor-0" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.296783 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ead2164-cda8-432f-b397-2866c55ccdbd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1ead2164-cda8-432f-b397-2866c55ccdbd\") " pod="openstack/nova-cell1-conductor-0" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.310054 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-6578955fd5-v7b7s"] Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.398590 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ead2164-cda8-432f-b397-2866c55ccdbd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1ead2164-cda8-432f-b397-2866c55ccdbd\") " pod="openstack/nova-cell1-conductor-0" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.399109 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ead2164-cda8-432f-b397-2866c55ccdbd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1ead2164-cda8-432f-b397-2866c55ccdbd\") " pod="openstack/nova-cell1-conductor-0" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.399147 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kd2ln\" (UniqueName: \"kubernetes.io/projected/1ead2164-cda8-432f-b397-2866c55ccdbd-kube-api-access-kd2ln\") pod \"nova-cell1-conductor-0\" (UID: \"1ead2164-cda8-432f-b397-2866c55ccdbd\") " pod="openstack/nova-cell1-conductor-0" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 
10:45:11.404195 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1ead2164-cda8-432f-b397-2866c55ccdbd-combined-ca-bundle\") pod \"nova-cell1-conductor-0\" (UID: \"1ead2164-cda8-432f-b397-2866c55ccdbd\") " pod="openstack/nova-cell1-conductor-0" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.404942 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1ead2164-cda8-432f-b397-2866c55ccdbd-config-data\") pod \"nova-cell1-conductor-0\" (UID: \"1ead2164-cda8-432f-b397-2866c55ccdbd\") " pod="openstack/nova-cell1-conductor-0" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.427290 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kd2ln\" (UniqueName: \"kubernetes.io/projected/1ead2164-cda8-432f-b397-2866c55ccdbd-kube-api-access-kd2ln\") pod \"nova-cell1-conductor-0\" (UID: \"1ead2164-cda8-432f-b397-2866c55ccdbd\") " pod="openstack/nova-cell1-conductor-0" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.577607 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-conductor-0" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.578368 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.702994 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-combined-ca-bundle\") pod \"313b3021-c103-47ac-9cb5-b38e971d22fd\" (UID: \"313b3021-c103-47ac-9cb5-b38e971d22fd\") " Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.703093 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xhlls\" (UniqueName: \"kubernetes.io/projected/313b3021-c103-47ac-9cb5-b38e971d22fd-kube-api-access-xhlls\") pod \"313b3021-c103-47ac-9cb5-b38e971d22fd\" (UID: \"313b3021-c103-47ac-9cb5-b38e971d22fd\") " Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.703332 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-config-data\") pod \"313b3021-c103-47ac-9cb5-b38e971d22fd\" (UID: \"313b3021-c103-47ac-9cb5-b38e971d22fd\") " Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.703364 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-scripts\") pod \"313b3021-c103-47ac-9cb5-b38e971d22fd\" (UID: \"313b3021-c103-47ac-9cb5-b38e971d22fd\") " Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.708337 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/313b3021-c103-47ac-9cb5-b38e971d22fd-kube-api-access-xhlls" (OuterVolumeSpecName: "kube-api-access-xhlls") pod "313b3021-c103-47ac-9cb5-b38e971d22fd" (UID: "313b3021-c103-47ac-9cb5-b38e971d22fd"). InnerVolumeSpecName "kube-api-access-xhlls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.713491 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-scripts" (OuterVolumeSpecName: "scripts") pod "313b3021-c103-47ac-9cb5-b38e971d22fd" (UID: "313b3021-c103-47ac-9cb5-b38e971d22fd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.739090 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "313b3021-c103-47ac-9cb5-b38e971d22fd" (UID: "313b3021-c103-47ac-9cb5-b38e971d22fd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.743850 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-config-data" (OuterVolumeSpecName: "config-data") pod "313b3021-c103-47ac-9cb5-b38e971d22fd" (UID: "313b3021-c103-47ac-9cb5-b38e971d22fd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.805009 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.805035 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.805044 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/313b3021-c103-47ac-9cb5-b38e971d22fd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:11 crc kubenswrapper[4765]: I0319 10:45:11.805054 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xhlls\" (UniqueName: \"kubernetes.io/projected/313b3021-c103-47ac-9cb5-b38e971d22fd-kube-api-access-xhlls\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.081306 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-conductor-0"] Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.223704 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1ead2164-cda8-432f-b397-2866c55ccdbd","Type":"ContainerStarted","Data":"1c285444a84dd3a5eecb4f6b9ffb05037a5e4233e008161cb48cd35183a3960f"} Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.234885 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell0-cell-mapping-7rjkg" event={"ID":"313b3021-c103-47ac-9cb5-b38e971d22fd","Type":"ContainerDied","Data":"b1b11fe0913bfb11371e2e5db352daee604039b68c3936b48d2804630a0461d7"} Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.235206 4765 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="b1b11fe0913bfb11371e2e5db352daee604039b68c3936b48d2804630a0461d7" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.235074 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell0-cell-mapping-7rjkg" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.277385 4765 generic.go:334] "Generic (PLEG): container finished" podID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerID="eaa3319bac0f28efb9983e12dd4ff24efc0e820215132e22f6f3366ccf6479f7" exitCode=0 Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.277403 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0afe23f-4c2a-4e60-9f57-189009a620f8","Type":"ContainerDied","Data":"eaa3319bac0f28efb9983e12dd4ff24efc0e820215132e22f6f3366ccf6479f7"} Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.282597 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.282929 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" containerName="nova-api-api" containerID="cri-o://8440c740ba45f159c4729acf2963d0a61a51e3e0517feb0917d667727b25a302" gracePeriod=30 Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.283034 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" containerName="nova-api-log" containerID="cri-o://c6d3884e7157d5721faedca5951c0847ab5132bf77003a1f3bc6e41d7f43ce98" gracePeriod=30 Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.407539 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="630c00dd-9d08-4035-88a2-0533792f2118" path="/var/lib/kubelet/pods/630c00dd-9d08-4035-88a2-0533792f2118/volumes" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 
10:45:12.408209 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.408371 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="f13d82bb-947c-43a3-8681-00c59cb179d9" containerName="nova-scheduler-scheduler" containerID="cri-o://b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b" gracePeriod=30 Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.427156 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:45:12 crc kubenswrapper[4765]: E0319 10:45:12.525888 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod313b3021_c103_47ac_9cb5_b38e971d22fd.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod313b3021_c103_47ac_9cb5_b38e971d22fd.slice/crio-b1b11fe0913bfb11371e2e5db352daee604039b68c3936b48d2804630a0461d7\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2637c2a0_7afc_42c7_b1e3_c6c1813dd1c2.slice/crio-c6d3884e7157d5721faedca5951c0847ab5132bf77003a1f3bc6e41d7f43ce98.scope\": RecentStats: unable to find data in memory cache]" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.533338 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-combined-ca-bundle\") pod \"a0afe23f-4c2a-4e60-9f57-189009a620f8\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.533401 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-scripts\") pod \"a0afe23f-4c2a-4e60-9f57-189009a620f8\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.533461 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0afe23f-4c2a-4e60-9f57-189009a620f8-log-httpd\") pod \"a0afe23f-4c2a-4e60-9f57-189009a620f8\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.533524 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0afe23f-4c2a-4e60-9f57-189009a620f8-run-httpd\") pod \"a0afe23f-4c2a-4e60-9f57-189009a620f8\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.533540 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rll9\" (UniqueName: \"kubernetes.io/projected/a0afe23f-4c2a-4e60-9f57-189009a620f8-kube-api-access-8rll9\") pod \"a0afe23f-4c2a-4e60-9f57-189009a620f8\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.533639 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-config-data\") pod \"a0afe23f-4c2a-4e60-9f57-189009a620f8\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.533690 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-sg-core-conf-yaml\") pod \"a0afe23f-4c2a-4e60-9f57-189009a620f8\" (UID: \"a0afe23f-4c2a-4e60-9f57-189009a620f8\") " Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.543979 4765 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0afe23f-4c2a-4e60-9f57-189009a620f8-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "a0afe23f-4c2a-4e60-9f57-189009a620f8" (UID: "a0afe23f-4c2a-4e60-9f57-189009a620f8"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.544731 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a0afe23f-4c2a-4e60-9f57-189009a620f8-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "a0afe23f-4c2a-4e60-9f57-189009a620f8" (UID: "a0afe23f-4c2a-4e60-9f57-189009a620f8"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.548310 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-scripts" (OuterVolumeSpecName: "scripts") pod "a0afe23f-4c2a-4e60-9f57-189009a620f8" (UID: "a0afe23f-4c2a-4e60-9f57-189009a620f8"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.571004 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0afe23f-4c2a-4e60-9f57-189009a620f8-kube-api-access-8rll9" (OuterVolumeSpecName: "kube-api-access-8rll9") pod "a0afe23f-4c2a-4e60-9f57-189009a620f8" (UID: "a0afe23f-4c2a-4e60-9f57-189009a620f8"). InnerVolumeSpecName "kube-api-access-8rll9". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.606334 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "a0afe23f-4c2a-4e60-9f57-189009a620f8" (UID: "a0afe23f-4c2a-4e60-9f57-189009a620f8"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.635548 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.635588 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0afe23f-4c2a-4e60-9f57-189009a620f8-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.635599 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/a0afe23f-4c2a-4e60-9f57-189009a620f8-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.635608 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8rll9\" (UniqueName: \"kubernetes.io/projected/a0afe23f-4c2a-4e60-9f57-189009a620f8-kube-api-access-8rll9\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.635616 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.663221 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "a0afe23f-4c2a-4e60-9f57-189009a620f8" (UID: "a0afe23f-4c2a-4e60-9f57-189009a620f8"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.690082 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-config-data" (OuterVolumeSpecName: "config-data") pod "a0afe23f-4c2a-4e60-9f57-189009a620f8" (UID: "a0afe23f-4c2a-4e60-9f57-189009a620f8"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.738055 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:12 crc kubenswrapper[4765]: I0319 10:45:12.738094 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a0afe23f-4c2a-4e60-9f57-189009a620f8-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.286368 4765 generic.go:334] "Generic (PLEG): container finished" podID="2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" containerID="c6d3884e7157d5721faedca5951c0847ab5132bf77003a1f3bc6e41d7f43ce98" exitCode=143 Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.286444 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2","Type":"ContainerDied","Data":"c6d3884e7157d5721faedca5951c0847ab5132bf77003a1f3bc6e41d7f43ce98"} Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.290279 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.290272 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"a0afe23f-4c2a-4e60-9f57-189009a620f8","Type":"ContainerDied","Data":"86899b78b87e834b4fc89c88af757294e2c6780beac29bb4f1a342f1cf7f60a7"} Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.290417 4765 scope.go:117] "RemoveContainer" containerID="3a42f834745602a6c5e722e1de4dee0c09470d9cd8a7dbc46fa8cea16f066c33" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.291880 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-conductor-0" event={"ID":"1ead2164-cda8-432f-b397-2866c55ccdbd","Type":"ContainerStarted","Data":"37a42f4c454f6cf7b98551d532d4430434ac5d6bd0b11016acca3e9f70a97900"} Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.292158 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-conductor-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.313280 4765 scope.go:117] "RemoveContainer" containerID="a28d6020624e0ff33f61930cb465b77be95b1eea5da41229812c2eaf07f0fd15" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.320406 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-conductor-0" podStartSLOduration=2.320389093 podStartE2EDuration="2.320389093s" podCreationTimestamp="2026-03-19 10:45:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:45:13.312059039 +0000 UTC m=+1411.661004591" watchObservedRunningTime="2026-03-19 10:45:13.320389093 +0000 UTC m=+1411.669334635" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.345890 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.349663 4765 scope.go:117] "RemoveContainer" 
containerID="eaa3319bac0f28efb9983e12dd4ff24efc0e820215132e22f6f3366ccf6479f7" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.356717 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.386497 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:45:13 crc kubenswrapper[4765]: E0319 10:45:13.409095 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="proxy-httpd" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.409391 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="proxy-httpd" Mar 19 10:45:13 crc kubenswrapper[4765]: E0319 10:45:13.409438 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="sg-core" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.409447 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="sg-core" Mar 19 10:45:13 crc kubenswrapper[4765]: E0319 10:45:13.409478 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="ceilometer-central-agent" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.409487 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="ceilometer-central-agent" Mar 19 10:45:13 crc kubenswrapper[4765]: E0319 10:45:13.409496 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="313b3021-c103-47ac-9cb5-b38e971d22fd" containerName="nova-manage" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.409504 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="313b3021-c103-47ac-9cb5-b38e971d22fd" containerName="nova-manage" Mar 19 10:45:13 crc kubenswrapper[4765]: E0319 
10:45:13.409525 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="ceilometer-notification-agent" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.409533 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="ceilometer-notification-agent" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.409873 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="ceilometer-notification-agent" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.409890 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="313b3021-c103-47ac-9cb5-b38e971d22fd" containerName="nova-manage" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.409901 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="sg-core" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.409909 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="proxy-httpd" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.409921 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" containerName="ceilometer-central-agent" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.412015 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.418811 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.419055 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.419168 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.420588 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.423975 4765 scope.go:117] "RemoveContainer" containerID="7fd7c5b075326d2630e8509d2874b4a7931bd2d4fb28c1d2328de2daf3720f15" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.552164 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.552334 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vsgwl\" (UniqueName: \"kubernetes.io/projected/7d7e9480-4187-4b89-9638-88131527013a-kube-api-access-vsgwl\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.552415 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: 
\"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.552547 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-scripts\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.552654 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-config-data\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.552934 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.553025 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7e9480-4187-4b89-9638-88131527013a-log-httpd\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.553076 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7e9480-4187-4b89-9638-88131527013a-run-httpd\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.655949 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vsgwl\" (UniqueName: \"kubernetes.io/projected/7d7e9480-4187-4b89-9638-88131527013a-kube-api-access-vsgwl\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.656037 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.656092 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-scripts\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.656130 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-config-data\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.656244 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.656277 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7e9480-4187-4b89-9638-88131527013a-log-httpd\") pod \"ceilometer-0\" (UID: 
\"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.656299 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7e9480-4187-4b89-9638-88131527013a-run-httpd\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.656329 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.657367 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7e9480-4187-4b89-9638-88131527013a-log-httpd\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.657796 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7e9480-4187-4b89-9638-88131527013a-run-httpd\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.665179 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.667047 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-scripts\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.667106 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.668148 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-config-data\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.668524 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.676561 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vsgwl\" (UniqueName: \"kubernetes.io/projected/7d7e9480-4187-4b89-9638-88131527013a-kube-api-access-vsgwl\") pod \"ceilometer-0\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " pod="openstack/ceilometer-0" Mar 19 10:45:13 crc kubenswrapper[4765]: I0319 10:45:13.792551 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:45:14 crc kubenswrapper[4765]: I0319 10:45:14.266110 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:45:14 crc kubenswrapper[4765]: E0319 10:45:14.303779 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 10:45:14 crc kubenswrapper[4765]: E0319 10:45:14.317320 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 10:45:14 crc kubenswrapper[4765]: E0319 10:45:14.321085 4765 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" containerID="b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b" cmd=["/usr/bin/pgrep","-r","DRST","nova-scheduler"] Mar 19 10:45:14 crc kubenswrapper[4765]: E0319 10:45:14.321171 4765 prober.go:104] "Probe errored" err="rpc error: code = Unknown desc = command error: cannot register an exec PID: container is stopping, stdout: , stderr: , exit code -1" probeType="Readiness" pod="openstack/nova-scheduler-0" podUID="f13d82bb-947c-43a3-8681-00c59cb179d9" containerName="nova-scheduler-scheduler" Mar 19 10:45:14 crc kubenswrapper[4765]: I0319 10:45:14.323338 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7d7e9480-4187-4b89-9638-88131527013a","Type":"ContainerStarted","Data":"97e479c518f7867be143c283a95c89b5c1653012c06a9be0ac914c98b1ae05c9"} Mar 19 10:45:14 crc kubenswrapper[4765]: I0319 10:45:14.389050 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0afe23f-4c2a-4e60-9f57-189009a620f8" path="/var/lib/kubelet/pods/a0afe23f-4c2a-4e60-9f57-189009a620f8/volumes" Mar 19 10:45:15 crc kubenswrapper[4765]: I0319 10:45:15.340528 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7e9480-4187-4b89-9638-88131527013a","Type":"ContainerStarted","Data":"4dd5b51b3307c703257c218ba8122c5cdc6140ccc9759bf558e2c80c7c5bfea7"} Mar 19 10:45:16 crc kubenswrapper[4765]: I0319 10:45:16.351930 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7e9480-4187-4b89-9638-88131527013a","Type":"ContainerStarted","Data":"e27a91a7c3d1d859332550838dcccc7497458472981000c3c0a5177bc0bd5953"} Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.159538 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.242690 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f13d82bb-947c-43a3-8681-00c59cb179d9-config-data\") pod \"f13d82bb-947c-43a3-8681-00c59cb179d9\" (UID: \"f13d82bb-947c-43a3-8681-00c59cb179d9\") " Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.242807 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13d82bb-947c-43a3-8681-00c59cb179d9-combined-ca-bundle\") pod \"f13d82bb-947c-43a3-8681-00c59cb179d9\" (UID: \"f13d82bb-947c-43a3-8681-00c59cb179d9\") " Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.242842 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45qvh\" (UniqueName: \"kubernetes.io/projected/f13d82bb-947c-43a3-8681-00c59cb179d9-kube-api-access-45qvh\") pod \"f13d82bb-947c-43a3-8681-00c59cb179d9\" (UID: \"f13d82bb-947c-43a3-8681-00c59cb179d9\") " Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.250481 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f13d82bb-947c-43a3-8681-00c59cb179d9-kube-api-access-45qvh" (OuterVolumeSpecName: "kube-api-access-45qvh") pod "f13d82bb-947c-43a3-8681-00c59cb179d9" (UID: "f13d82bb-947c-43a3-8681-00c59cb179d9"). InnerVolumeSpecName "kube-api-access-45qvh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.273869 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13d82bb-947c-43a3-8681-00c59cb179d9-config-data" (OuterVolumeSpecName: "config-data") pod "f13d82bb-947c-43a3-8681-00c59cb179d9" (UID: "f13d82bb-947c-43a3-8681-00c59cb179d9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.277896 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.281851 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f13d82bb-947c-43a3-8681-00c59cb179d9-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "f13d82bb-947c-43a3-8681-00c59cb179d9" (UID: "f13d82bb-947c-43a3-8681-00c59cb179d9"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.338025 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.338076 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.344898 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-combined-ca-bundle\") pod \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\" (UID: \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.345012 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-config-data\") pod \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\" (UID: \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.345084 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-logs\") pod \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\" (UID: 
\"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.345261 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n9fbb\" (UniqueName: \"kubernetes.io/projected/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-kube-api-access-n9fbb\") pod \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\" (UID: \"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2\") " Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.345457 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-logs" (OuterVolumeSpecName: "logs") pod "2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" (UID: "2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.345827 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/f13d82bb-947c-43a3-8681-00c59cb179d9-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.345847 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f13d82bb-947c-43a3-8681-00c59cb179d9-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.345859 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45qvh\" (UniqueName: \"kubernetes.io/projected/f13d82bb-947c-43a3-8681-00c59cb179d9-kube-api-access-45qvh\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.345870 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.348042 4765 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-kube-api-access-n9fbb" (OuterVolumeSpecName: "kube-api-access-n9fbb") pod "2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" (UID: "2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2"). InnerVolumeSpecName "kube-api-access-n9fbb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.370986 4765 generic.go:334] "Generic (PLEG): container finished" podID="2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" containerID="8440c740ba45f159c4729acf2963d0a61a51e3e0517feb0917d667727b25a302" exitCode=0 Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.371158 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2","Type":"ContainerDied","Data":"8440c740ba45f159c4729acf2963d0a61a51e3e0517feb0917d667727b25a302"} Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.371249 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2","Type":"ContainerDied","Data":"f98753adfeb5d1d9c506660df0f0261df6498cabd7502d3de99e1026c4e33101"} Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.371355 4765 scope.go:117] "RemoveContainer" containerID="8440c740ba45f159c4729acf2963d0a61a51e3e0517feb0917d667727b25a302" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.371594 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.374644 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" (UID: "2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.375562 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-config-data" (OuterVolumeSpecName: "config-data") pod "2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" (UID: "2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.377383 4765 generic.go:334] "Generic (PLEG): container finished" podID="f13d82bb-947c-43a3-8681-00c59cb179d9" containerID="b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b" exitCode=0 Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.377441 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.377461 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f13d82bb-947c-43a3-8681-00c59cb179d9","Type":"ContainerDied","Data":"b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b"} Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.377493 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"f13d82bb-947c-43a3-8681-00c59cb179d9","Type":"ContainerDied","Data":"c6a49a35bb52caedf44c4612ee987a233774bb01e11f636f401a616684ba3b35"} Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.380337 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7e9480-4187-4b89-9638-88131527013a","Type":"ContainerStarted","Data":"85853a11e26a9c5d9f89867f55ca999dafce26f5cba02bc9efc2c3c82424db3c"} Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.401338 4765 scope.go:117] "RemoveContainer" containerID="c6d3884e7157d5721faedca5951c0847ab5132bf77003a1f3bc6e41d7f43ce98" Mar 19 
10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.430858 4765 scope.go:117] "RemoveContainer" containerID="8440c740ba45f159c4729acf2963d0a61a51e3e0517feb0917d667727b25a302" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.431103 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 10:45:17 crc kubenswrapper[4765]: E0319 10:45:17.433373 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8440c740ba45f159c4729acf2963d0a61a51e3e0517feb0917d667727b25a302\": container with ID starting with 8440c740ba45f159c4729acf2963d0a61a51e3e0517feb0917d667727b25a302 not found: ID does not exist" containerID="8440c740ba45f159c4729acf2963d0a61a51e3e0517feb0917d667727b25a302" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.433432 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8440c740ba45f159c4729acf2963d0a61a51e3e0517feb0917d667727b25a302"} err="failed to get container status \"8440c740ba45f159c4729acf2963d0a61a51e3e0517feb0917d667727b25a302\": rpc error: code = NotFound desc = could not find container \"8440c740ba45f159c4729acf2963d0a61a51e3e0517feb0917d667727b25a302\": container with ID starting with 8440c740ba45f159c4729acf2963d0a61a51e3e0517feb0917d667727b25a302 not found: ID does not exist" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.433478 4765 scope.go:117] "RemoveContainer" containerID="c6d3884e7157d5721faedca5951c0847ab5132bf77003a1f3bc6e41d7f43ce98" Mar 19 10:45:17 crc kubenswrapper[4765]: E0319 10:45:17.433774 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c6d3884e7157d5721faedca5951c0847ab5132bf77003a1f3bc6e41d7f43ce98\": container with ID starting with c6d3884e7157d5721faedca5951c0847ab5132bf77003a1f3bc6e41d7f43ce98 not found: ID does not exist" 
containerID="c6d3884e7157d5721faedca5951c0847ab5132bf77003a1f3bc6e41d7f43ce98" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.433822 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c6d3884e7157d5721faedca5951c0847ab5132bf77003a1f3bc6e41d7f43ce98"} err="failed to get container status \"c6d3884e7157d5721faedca5951c0847ab5132bf77003a1f3bc6e41d7f43ce98\": rpc error: code = NotFound desc = could not find container \"c6d3884e7157d5721faedca5951c0847ab5132bf77003a1f3bc6e41d7f43ce98\": container with ID starting with c6d3884e7157d5721faedca5951c0847ab5132bf77003a1f3bc6e41d7f43ce98 not found: ID does not exist" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.433839 4765 scope.go:117] "RemoveContainer" containerID="b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.448114 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n9fbb\" (UniqueName: \"kubernetes.io/projected/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-kube-api-access-n9fbb\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.448148 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.448159 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.451447 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.461174 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 10:45:17 crc 
kubenswrapper[4765]: E0319 10:45:17.461730 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" containerName="nova-api-log" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.461756 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" containerName="nova-api-log" Mar 19 10:45:17 crc kubenswrapper[4765]: E0319 10:45:17.461779 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f13d82bb-947c-43a3-8681-00c59cb179d9" containerName="nova-scheduler-scheduler" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.461788 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f13d82bb-947c-43a3-8681-00c59cb179d9" containerName="nova-scheduler-scheduler" Mar 19 10:45:17 crc kubenswrapper[4765]: E0319 10:45:17.461811 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" containerName="nova-api-api" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.461820 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" containerName="nova-api-api" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.462071 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f13d82bb-947c-43a3-8681-00c59cb179d9" containerName="nova-scheduler-scheduler" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.462090 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" containerName="nova-api-api" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.462111 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" containerName="nova-api-log" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.462912 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.464543 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.465498 4765 scope.go:117] "RemoveContainer" containerID="b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b" Mar 19 10:45:17 crc kubenswrapper[4765]: E0319 10:45:17.466705 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b\": container with ID starting with b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b not found: ID does not exist" containerID="b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.466732 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b"} err="failed to get container status \"b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b\": rpc error: code = NotFound desc = could not find container \"b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b\": container with ID starting with b3735fe15633294e4a33c3c78214d0a3d599672e17b8d60467cc540e79b14f6b not found: ID does not exist" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.470796 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.550100 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c47edca-0f68-4cd2-ac44-b14fa99200bb-config-data\") pod \"nova-scheduler-0\" (UID: \"7c47edca-0f68-4cd2-ac44-b14fa99200bb\") " pod="openstack/nova-scheduler-0" Mar 19 
10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.550204 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c47edca-0f68-4cd2-ac44-b14fa99200bb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7c47edca-0f68-4cd2-ac44-b14fa99200bb\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.550250 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4dxjt\" (UniqueName: \"kubernetes.io/projected/7c47edca-0f68-4cd2-ac44-b14fa99200bb-kube-api-access-4dxjt\") pod \"nova-scheduler-0\" (UID: \"7c47edca-0f68-4cd2-ac44-b14fa99200bb\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.652094 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c47edca-0f68-4cd2-ac44-b14fa99200bb-config-data\") pod \"nova-scheduler-0\" (UID: \"7c47edca-0f68-4cd2-ac44-b14fa99200bb\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.652214 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c47edca-0f68-4cd2-ac44-b14fa99200bb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7c47edca-0f68-4cd2-ac44-b14fa99200bb\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.652247 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4dxjt\" (UniqueName: \"kubernetes.io/projected/7c47edca-0f68-4cd2-ac44-b14fa99200bb-kube-api-access-4dxjt\") pod \"nova-scheduler-0\" (UID: \"7c47edca-0f68-4cd2-ac44-b14fa99200bb\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.656410 4765 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c47edca-0f68-4cd2-ac44-b14fa99200bb-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"7c47edca-0f68-4cd2-ac44-b14fa99200bb\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.665340 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c47edca-0f68-4cd2-ac44-b14fa99200bb-config-data\") pod \"nova-scheduler-0\" (UID: \"7c47edca-0f68-4cd2-ac44-b14fa99200bb\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.669700 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4dxjt\" (UniqueName: \"kubernetes.io/projected/7c47edca-0f68-4cd2-ac44-b14fa99200bb-kube-api-access-4dxjt\") pod \"nova-scheduler-0\" (UID: \"7c47edca-0f68-4cd2-ac44-b14fa99200bb\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.706798 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.725767 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.743078 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.756943 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.760695 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.769140 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.783503 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.885060 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83046945-46ea-4586-bf0a-3a992cf42900-logs\") pod \"nova-api-0\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " pod="openstack/nova-api-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.885121 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83046945-46ea-4586-bf0a-3a992cf42900-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " pod="openstack/nova-api-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.885163 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83046945-46ea-4586-bf0a-3a992cf42900-config-data\") pod \"nova-api-0\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " pod="openstack/nova-api-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.885237 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmjnb\" (UniqueName: \"kubernetes.io/projected/83046945-46ea-4586-bf0a-3a992cf42900-kube-api-access-fmjnb\") pod \"nova-api-0\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " pod="openstack/nova-api-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.988970 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83046945-46ea-4586-bf0a-3a992cf42900-logs\") pod \"nova-api-0\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " pod="openstack/nova-api-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.989273 4765 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83046945-46ea-4586-bf0a-3a992cf42900-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " pod="openstack/nova-api-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.989362 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83046945-46ea-4586-bf0a-3a992cf42900-config-data\") pod \"nova-api-0\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " pod="openstack/nova-api-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.989497 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fmjnb\" (UniqueName: \"kubernetes.io/projected/83046945-46ea-4586-bf0a-3a992cf42900-kube-api-access-fmjnb\") pod \"nova-api-0\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " pod="openstack/nova-api-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.990279 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83046945-46ea-4586-bf0a-3a992cf42900-logs\") pod \"nova-api-0\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " pod="openstack/nova-api-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.996794 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83046945-46ea-4586-bf0a-3a992cf42900-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " pod="openstack/nova-api-0" Mar 19 10:45:17 crc kubenswrapper[4765]: I0319 10:45:17.997199 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83046945-46ea-4586-bf0a-3a992cf42900-config-data\") pod \"nova-api-0\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " pod="openstack/nova-api-0" Mar 19 10:45:18 crc kubenswrapper[4765]: 
I0319 10:45:18.018292 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmjnb\" (UniqueName: \"kubernetes.io/projected/83046945-46ea-4586-bf0a-3a992cf42900-kube-api-access-fmjnb\") pod \"nova-api-0\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " pod="openstack/nova-api-0" Mar 19 10:45:18 crc kubenswrapper[4765]: I0319 10:45:18.091510 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 10:45:18 crc kubenswrapper[4765]: I0319 10:45:18.377589 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2" path="/var/lib/kubelet/pods/2637c2a0-7afc-42c7-b1e3-c6c1813dd1c2/volumes" Mar 19 10:45:18 crc kubenswrapper[4765]: I0319 10:45:18.378688 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f13d82bb-947c-43a3-8681-00c59cb179d9" path="/var/lib/kubelet/pods/f13d82bb-947c-43a3-8681-00c59cb179d9/volumes" Mar 19 10:45:18 crc kubenswrapper[4765]: I0319 10:45:18.432778 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 10:45:18 crc kubenswrapper[4765]: I0319 10:45:18.522555 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/kube-state-metrics-0" Mar 19 10:45:18 crc kubenswrapper[4765]: I0319 10:45:18.602300 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:45:19 crc kubenswrapper[4765]: I0319 10:45:19.426414 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7c47edca-0f68-4cd2-ac44-b14fa99200bb","Type":"ContainerStarted","Data":"f89b509721f854692c414377e6348e770a6a6fbbea4fd5cacdb78f94ba86cfbd"} Mar 19 10:45:19 crc kubenswrapper[4765]: I0319 10:45:19.427090 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" 
event={"ID":"7c47edca-0f68-4cd2-ac44-b14fa99200bb","Type":"ContainerStarted","Data":"34c37318a2ac974a7be40e4c7cda0461e32359b5801aadda42dd75a1c49f36dd"} Mar 19 10:45:19 crc kubenswrapper[4765]: I0319 10:45:19.428996 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83046945-46ea-4586-bf0a-3a992cf42900","Type":"ContainerStarted","Data":"2f871f65b47cb05f427aaf0e2b982616f5c6e706aba3bafb6424e2315955dc48"} Mar 19 10:45:19 crc kubenswrapper[4765]: I0319 10:45:19.429044 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83046945-46ea-4586-bf0a-3a992cf42900","Type":"ContainerStarted","Data":"14067a844015a7028308b2fc2554e03f1f0484d84436a5acf1ca643e72476836"} Mar 19 10:45:19 crc kubenswrapper[4765]: I0319 10:45:19.429063 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83046945-46ea-4586-bf0a-3a992cf42900","Type":"ContainerStarted","Data":"3ab7e89950991438a1678e4e2b8e140e8192fe82d6354ae234d739b0cb532570"} Mar 19 10:45:19 crc kubenswrapper[4765]: I0319 10:45:19.432388 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7e9480-4187-4b89-9638-88131527013a","Type":"ContainerStarted","Data":"a97cad91ce6f2cfd1126ce135233ee43d8d31e7a26e9c9d4ba5add6ddc17e4a6"} Mar 19 10:45:19 crc kubenswrapper[4765]: I0319 10:45:19.432541 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 10:45:19 crc kubenswrapper[4765]: I0319 10:45:19.452944 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=2.452927772 podStartE2EDuration="2.452927772s" podCreationTimestamp="2026-03-19 10:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:45:19.445320019 +0000 UTC m=+1417.794265581" 
watchObservedRunningTime="2026-03-19 10:45:19.452927772 +0000 UTC m=+1417.801873314" Mar 19 10:45:19 crc kubenswrapper[4765]: I0319 10:45:19.473952 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.473933342 podStartE2EDuration="2.473933342s" podCreationTimestamp="2026-03-19 10:45:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:45:19.468421227 +0000 UTC m=+1417.817366799" watchObservedRunningTime="2026-03-19 10:45:19.473933342 +0000 UTC m=+1417.822878884" Mar 19 10:45:19 crc kubenswrapper[4765]: I0319 10:45:19.497148 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.032538934 podStartE2EDuration="6.497133242s" podCreationTimestamp="2026-03-19 10:45:13 +0000 UTC" firstStartedPulling="2026-03-19 10:45:14.284486943 +0000 UTC m=+1412.633432485" lastFinishedPulling="2026-03-19 10:45:18.749081251 +0000 UTC m=+1417.098026793" observedRunningTime="2026-03-19 10:45:19.496929897 +0000 UTC m=+1417.845875459" watchObservedRunningTime="2026-03-19 10:45:19.497133242 +0000 UTC m=+1417.846078784" Mar 19 10:45:21 crc kubenswrapper[4765]: I0319 10:45:21.616537 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-cell1-conductor-0" Mar 19 10:45:22 crc kubenswrapper[4765]: I0319 10:45:22.784728 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 10:45:27 crc kubenswrapper[4765]: I0319 10:45:27.785336 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 10:45:27 crc kubenswrapper[4765]: I0319 10:45:27.821847 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 10:45:28 crc kubenswrapper[4765]: I0319 10:45:28.094244 4765 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 10:45:28 crc kubenswrapper[4765]: I0319 10:45:28.094301 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 10:45:28 crc kubenswrapper[4765]: I0319 10:45:28.563757 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 10:45:29 crc kubenswrapper[4765]: I0319 10:45:29.176123 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="83046945-46ea-4586-bf0a-3a992cf42900" containerName="nova-api-api" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:45:29 crc kubenswrapper[4765]: I0319 10:45:29.176132 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="83046945-46ea-4586-bf0a-3a992cf42900" containerName="nova-api-log" probeResult="failure" output="Get \"http://10.217.0.202:8774/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:45:31 crc kubenswrapper[4765]: I0319 10:45:31.656567 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:45:31 crc kubenswrapper[4765]: I0319 10:45:31.656919 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.494435 4765 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.504376 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.596185 4765 generic.go:334] "Generic (PLEG): container finished" podID="b7cd8e59-b93c-4145-b7bc-c529a915598d" containerID="1b86d8deeb75f041c831dd87ed8a06440c475be8b2ec3919bcded6bc99e0b42d" exitCode=137 Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.596261 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7cd8e59-b93c-4145-b7bc-c529a915598d","Type":"ContainerDied","Data":"1b86d8deeb75f041c831dd87ed8a06440c475be8b2ec3919bcded6bc99e0b42d"} Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.596270 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.596375 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b7cd8e59-b93c-4145-b7bc-c529a915598d","Type":"ContainerDied","Data":"ab589258975ed644aaa4d702f91a30d6a9765ffabca7a1dcc44df87160e883d6"} Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.596424 4765 scope.go:117] "RemoveContainer" containerID="1b86d8deeb75f041c831dd87ed8a06440c475be8b2ec3919bcded6bc99e0b42d" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.598657 4765 generic.go:334] "Generic (PLEG): container finished" podID="d924f586-eb76-40c9-a8d7-10f9f9935511" containerID="653fdf72127ff457c6a05cee3a00ec1f29d1982c0ae52a69fdaef73efefb4a3a" exitCode=137 Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.598704 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" 
event={"ID":"d924f586-eb76-40c9-a8d7-10f9f9935511","Type":"ContainerDied","Data":"653fdf72127ff457c6a05cee3a00ec1f29d1982c0ae52a69fdaef73efefb4a3a"} Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.598722 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.598734 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"d924f586-eb76-40c9-a8d7-10f9f9935511","Type":"ContainerDied","Data":"df962087ef50707c8839559b65025ac52b623036a637ed07e21a00959570d752"} Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.618263 4765 scope.go:117] "RemoveContainer" containerID="1b86d8deeb75f041c831dd87ed8a06440c475be8b2ec3919bcded6bc99e0b42d" Mar 19 10:45:35 crc kubenswrapper[4765]: E0319 10:45:35.618723 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b86d8deeb75f041c831dd87ed8a06440c475be8b2ec3919bcded6bc99e0b42d\": container with ID starting with 1b86d8deeb75f041c831dd87ed8a06440c475be8b2ec3919bcded6bc99e0b42d not found: ID does not exist" containerID="1b86d8deeb75f041c831dd87ed8a06440c475be8b2ec3919bcded6bc99e0b42d" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.618756 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b86d8deeb75f041c831dd87ed8a06440c475be8b2ec3919bcded6bc99e0b42d"} err="failed to get container status \"1b86d8deeb75f041c831dd87ed8a06440c475be8b2ec3919bcded6bc99e0b42d\": rpc error: code = NotFound desc = could not find container \"1b86d8deeb75f041c831dd87ed8a06440c475be8b2ec3919bcded6bc99e0b42d\": container with ID starting with 1b86d8deeb75f041c831dd87ed8a06440c475be8b2ec3919bcded6bc99e0b42d not found: ID does not exist" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.618777 4765 scope.go:117] "RemoveContainer" 
containerID="653fdf72127ff457c6a05cee3a00ec1f29d1982c0ae52a69fdaef73efefb4a3a" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.635139 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9qf7\" (UniqueName: \"kubernetes.io/projected/b7cd8e59-b93c-4145-b7bc-c529a915598d-kube-api-access-v9qf7\") pod \"b7cd8e59-b93c-4145-b7bc-c529a915598d\" (UID: \"b7cd8e59-b93c-4145-b7bc-c529a915598d\") " Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.635538 4765 scope.go:117] "RemoveContainer" containerID="41c98f63e439a31ce9fe1b03c5867cc98db3c6e9f3ff2edb00734a3911ae3efc" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.636087 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d924f586-eb76-40c9-a8d7-10f9f9935511-logs\") pod \"d924f586-eb76-40c9-a8d7-10f9f9935511\" (UID: \"d924f586-eb76-40c9-a8d7-10f9f9935511\") " Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.636253 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d924f586-eb76-40c9-a8d7-10f9f9935511-config-data\") pod \"d924f586-eb76-40c9-a8d7-10f9f9935511\" (UID: \"d924f586-eb76-40c9-a8d7-10f9f9935511\") " Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.636332 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7cd8e59-b93c-4145-b7bc-c529a915598d-config-data\") pod \"b7cd8e59-b93c-4145-b7bc-c529a915598d\" (UID: \"b7cd8e59-b93c-4145-b7bc-c529a915598d\") " Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.636379 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d924f586-eb76-40c9-a8d7-10f9f9935511-combined-ca-bundle\") pod \"d924f586-eb76-40c9-a8d7-10f9f9935511\" (UID: 
\"d924f586-eb76-40c9-a8d7-10f9f9935511\") " Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.636421 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7cd8e59-b93c-4145-b7bc-c529a915598d-combined-ca-bundle\") pod \"b7cd8e59-b93c-4145-b7bc-c529a915598d\" (UID: \"b7cd8e59-b93c-4145-b7bc-c529a915598d\") " Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.636441 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g6dqp\" (UniqueName: \"kubernetes.io/projected/d924f586-eb76-40c9-a8d7-10f9f9935511-kube-api-access-g6dqp\") pod \"d924f586-eb76-40c9-a8d7-10f9f9935511\" (UID: \"d924f586-eb76-40c9-a8d7-10f9f9935511\") " Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.637523 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d924f586-eb76-40c9-a8d7-10f9f9935511-logs" (OuterVolumeSpecName: "logs") pod "d924f586-eb76-40c9-a8d7-10f9f9935511" (UID: "d924f586-eb76-40c9-a8d7-10f9f9935511"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.637661 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/d924f586-eb76-40c9-a8d7-10f9f9935511-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.641983 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7cd8e59-b93c-4145-b7bc-c529a915598d-kube-api-access-v9qf7" (OuterVolumeSpecName: "kube-api-access-v9qf7") pod "b7cd8e59-b93c-4145-b7bc-c529a915598d" (UID: "b7cd8e59-b93c-4145-b7bc-c529a915598d"). InnerVolumeSpecName "kube-api-access-v9qf7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.649314 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d924f586-eb76-40c9-a8d7-10f9f9935511-kube-api-access-g6dqp" (OuterVolumeSpecName: "kube-api-access-g6dqp") pod "d924f586-eb76-40c9-a8d7-10f9f9935511" (UID: "d924f586-eb76-40c9-a8d7-10f9f9935511"). InnerVolumeSpecName "kube-api-access-g6dqp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.665556 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d924f586-eb76-40c9-a8d7-10f9f9935511-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "d924f586-eb76-40c9-a8d7-10f9f9935511" (UID: "d924f586-eb76-40c9-a8d7-10f9f9935511"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.668108 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7cd8e59-b93c-4145-b7bc-c529a915598d-config-data" (OuterVolumeSpecName: "config-data") pod "b7cd8e59-b93c-4145-b7bc-c529a915598d" (UID: "b7cd8e59-b93c-4145-b7bc-c529a915598d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.670096 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7cd8e59-b93c-4145-b7bc-c529a915598d-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "b7cd8e59-b93c-4145-b7bc-c529a915598d" (UID: "b7cd8e59-b93c-4145-b7bc-c529a915598d"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.670544 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d924f586-eb76-40c9-a8d7-10f9f9935511-config-data" (OuterVolumeSpecName: "config-data") pod "d924f586-eb76-40c9-a8d7-10f9f9935511" (UID: "d924f586-eb76-40c9-a8d7-10f9f9935511"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.739489 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d924f586-eb76-40c9-a8d7-10f9f9935511-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.739550 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7cd8e59-b93c-4145-b7bc-c529a915598d-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.739559 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/d924f586-eb76-40c9-a8d7-10f9f9935511-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.739571 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b7cd8e59-b93c-4145-b7bc-c529a915598d-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.739582 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g6dqp\" (UniqueName: \"kubernetes.io/projected/d924f586-eb76-40c9-a8d7-10f9f9935511-kube-api-access-g6dqp\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.739591 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9qf7\" (UniqueName: 
\"kubernetes.io/projected/b7cd8e59-b93c-4145-b7bc-c529a915598d-kube-api-access-v9qf7\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.760154 4765 scope.go:117] "RemoveContainer" containerID="653fdf72127ff457c6a05cee3a00ec1f29d1982c0ae52a69fdaef73efefb4a3a" Mar 19 10:45:35 crc kubenswrapper[4765]: E0319 10:45:35.760807 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"653fdf72127ff457c6a05cee3a00ec1f29d1982c0ae52a69fdaef73efefb4a3a\": container with ID starting with 653fdf72127ff457c6a05cee3a00ec1f29d1982c0ae52a69fdaef73efefb4a3a not found: ID does not exist" containerID="653fdf72127ff457c6a05cee3a00ec1f29d1982c0ae52a69fdaef73efefb4a3a" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.760842 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"653fdf72127ff457c6a05cee3a00ec1f29d1982c0ae52a69fdaef73efefb4a3a"} err="failed to get container status \"653fdf72127ff457c6a05cee3a00ec1f29d1982c0ae52a69fdaef73efefb4a3a\": rpc error: code = NotFound desc = could not find container \"653fdf72127ff457c6a05cee3a00ec1f29d1982c0ae52a69fdaef73efefb4a3a\": container with ID starting with 653fdf72127ff457c6a05cee3a00ec1f29d1982c0ae52a69fdaef73efefb4a3a not found: ID does not exist" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.760860 4765 scope.go:117] "RemoveContainer" containerID="41c98f63e439a31ce9fe1b03c5867cc98db3c6e9f3ff2edb00734a3911ae3efc" Mar 19 10:45:35 crc kubenswrapper[4765]: E0319 10:45:35.761428 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"41c98f63e439a31ce9fe1b03c5867cc98db3c6e9f3ff2edb00734a3911ae3efc\": container with ID starting with 41c98f63e439a31ce9fe1b03c5867cc98db3c6e9f3ff2edb00734a3911ae3efc not found: ID does not exist" 
containerID="41c98f63e439a31ce9fe1b03c5867cc98db3c6e9f3ff2edb00734a3911ae3efc" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.761484 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"41c98f63e439a31ce9fe1b03c5867cc98db3c6e9f3ff2edb00734a3911ae3efc"} err="failed to get container status \"41c98f63e439a31ce9fe1b03c5867cc98db3c6e9f3ff2edb00734a3911ae3efc\": rpc error: code = NotFound desc = could not find container \"41c98f63e439a31ce9fe1b03c5867cc98db3c6e9f3ff2edb00734a3911ae3efc\": container with ID starting with 41c98f63e439a31ce9fe1b03c5867cc98db3c6e9f3ff2edb00734a3911ae3efc not found: ID does not exist" Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.946455 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.969022 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.979024 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 10:45:35 crc kubenswrapper[4765]: I0319 10:45:35.993114 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.009104 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 10:45:36 crc kubenswrapper[4765]: E0319 10:45:36.009753 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d924f586-eb76-40c9-a8d7-10f9f9935511" containerName="nova-metadata-log" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.009773 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d924f586-eb76-40c9-a8d7-10f9f9935511" containerName="nova-metadata-log" Mar 19 10:45:36 crc kubenswrapper[4765]: E0319 10:45:36.009792 4765 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d924f586-eb76-40c9-a8d7-10f9f9935511" containerName="nova-metadata-metadata" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.009799 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d924f586-eb76-40c9-a8d7-10f9f9935511" containerName="nova-metadata-metadata" Mar 19 10:45:36 crc kubenswrapper[4765]: E0319 10:45:36.009817 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7cd8e59-b93c-4145-b7bc-c529a915598d" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.009823 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7cd8e59-b93c-4145-b7bc-c529a915598d" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.010032 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d924f586-eb76-40c9-a8d7-10f9f9935511" containerName="nova-metadata-metadata" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.010044 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7cd8e59-b93c-4145-b7bc-c529a915598d" containerName="nova-cell1-novncproxy-novncproxy" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.010056 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d924f586-eb76-40c9-a8d7-10f9f9935511" containerName="nova-metadata-log" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.011156 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.027702 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.028415 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.038124 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.039582 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.041683 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-novncproxy-config-data" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.048666 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-vencrypt" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.048936 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-novncproxy-cell1-public-svc" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.051903 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.071036 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.096838 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.097097 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.151153 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92cc2fe-932d-4290-8331-225b2c5011d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b92cc2fe-932d-4290-8331-225b2c5011d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.151450 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92cc2fe-932d-4290-8331-225b2c5011d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b92cc2fe-932d-4290-8331-225b2c5011d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.151559 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-config-data\") pod \"nova-metadata-0\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.151711 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.151823 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09992c98-fab5-441f-b4a5-abc94d180a52-logs\") pod \"nova-metadata-0\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.151909 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92cc2fe-932d-4290-8331-225b2c5011d4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b92cc2fe-932d-4290-8331-225b2c5011d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.152017 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.152137 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92cc2fe-932d-4290-8331-225b2c5011d4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b92cc2fe-932d-4290-8331-225b2c5011d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.152250 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ntbq5\" (UniqueName: \"kubernetes.io/projected/09992c98-fab5-441f-b4a5-abc94d180a52-kube-api-access-ntbq5\") pod \"nova-metadata-0\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.152354 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x46tc\" (UniqueName: \"kubernetes.io/projected/b92cc2fe-932d-4290-8331-225b2c5011d4-kube-api-access-x46tc\") pod \"nova-cell1-novncproxy-0\" (UID: \"b92cc2fe-932d-4290-8331-225b2c5011d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.253943 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92cc2fe-932d-4290-8331-225b2c5011d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b92cc2fe-932d-4290-8331-225b2c5011d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.254083 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92cc2fe-932d-4290-8331-225b2c5011d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b92cc2fe-932d-4290-8331-225b2c5011d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.254149 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-config-data\") pod \"nova-metadata-0\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.254223 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.254300 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09992c98-fab5-441f-b4a5-abc94d180a52-logs\") pod \"nova-metadata-0\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.254384 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92cc2fe-932d-4290-8331-225b2c5011d4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: 
\"b92cc2fe-932d-4290-8331-225b2c5011d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.254409 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.254433 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92cc2fe-932d-4290-8331-225b2c5011d4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b92cc2fe-932d-4290-8331-225b2c5011d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.254484 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ntbq5\" (UniqueName: \"kubernetes.io/projected/09992c98-fab5-441f-b4a5-abc94d180a52-kube-api-access-ntbq5\") pod \"nova-metadata-0\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.254510 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x46tc\" (UniqueName: \"kubernetes.io/projected/b92cc2fe-932d-4290-8331-225b2c5011d4-kube-api-access-x46tc\") pod \"nova-cell1-novncproxy-0\" (UID: \"b92cc2fe-932d-4290-8331-225b2c5011d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.255171 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09992c98-fab5-441f-b4a5-abc94d180a52-logs\") pod \"nova-metadata-0\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc 
kubenswrapper[4765]: I0319 10:45:36.257424 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b92cc2fe-932d-4290-8331-225b2c5011d4-config-data\") pod \"nova-cell1-novncproxy-0\" (UID: \"b92cc2fe-932d-4290-8331-225b2c5011d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.258853 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/b92cc2fe-932d-4290-8331-225b2c5011d4-combined-ca-bundle\") pod \"nova-cell1-novncproxy-0\" (UID: \"b92cc2fe-932d-4290-8331-225b2c5011d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.258910 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-config-data\") pod \"nova-metadata-0\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.259605 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"vencrypt-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92cc2fe-932d-4290-8331-225b2c5011d4-vencrypt-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b92cc2fe-932d-4290-8331-225b2c5011d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.268269 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-novncproxy-tls-certs\" (UniqueName: \"kubernetes.io/secret/b92cc2fe-932d-4290-8331-225b2c5011d4-nova-novncproxy-tls-certs\") pod \"nova-cell1-novncproxy-0\" (UID: \"b92cc2fe-932d-4290-8331-225b2c5011d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.268865 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.270494 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.274216 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ntbq5\" (UniqueName: \"kubernetes.io/projected/09992c98-fab5-441f-b4a5-abc94d180a52-kube-api-access-ntbq5\") pod \"nova-metadata-0\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.274425 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x46tc\" (UniqueName: \"kubernetes.io/projected/b92cc2fe-932d-4290-8331-225b2c5011d4-kube-api-access-x46tc\") pod \"nova-cell1-novncproxy-0\" (UID: \"b92cc2fe-932d-4290-8331-225b2c5011d4\") " pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.365480 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.368342 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7cd8e59-b93c-4145-b7bc-c529a915598d" path="/var/lib/kubelet/pods/b7cd8e59-b93c-4145-b7bc-c529a915598d/volumes" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.369152 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d924f586-eb76-40c9-a8d7-10f9f9935511" path="/var/lib/kubelet/pods/d924f586-eb76-40c9-a8d7-10f9f9935511/volumes" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.385649 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.850037 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 10:45:36 crc kubenswrapper[4765]: W0319 10:45:36.931451 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb92cc2fe_932d_4290_8331_225b2c5011d4.slice/crio-3b67efb6a731f513f5db0d04bb61d8226d7e2d232c095a54695e9527137276de WatchSource:0}: Error finding container 3b67efb6a731f513f5db0d04bb61d8226d7e2d232c095a54695e9527137276de: Status 404 returned error can't find the container with id 3b67efb6a731f513f5db0d04bb61d8226d7e2d232c095a54695e9527137276de Mar 19 10:45:36 crc kubenswrapper[4765]: I0319 10:45:36.932906 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-novncproxy-0"] Mar 19 10:45:37 crc kubenswrapper[4765]: I0319 10:45:37.624862 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09992c98-fab5-441f-b4a5-abc94d180a52","Type":"ContainerStarted","Data":"a9df19d737101f6b88e6f349e59a5da9d4cb7a6810b1b2754333dfa4b450d85e"} Mar 19 10:45:37 crc kubenswrapper[4765]: I0319 10:45:37.624918 4765 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="openstack/nova-metadata-0" event={"ID":"09992c98-fab5-441f-b4a5-abc94d180a52","Type":"ContainerStarted","Data":"d99103fdab060d01459a8878ba8bcd33ce6202cccc3d69e13624c5e395469855"} Mar 19 10:45:37 crc kubenswrapper[4765]: I0319 10:45:37.624934 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09992c98-fab5-441f-b4a5-abc94d180a52","Type":"ContainerStarted","Data":"949c31618f239689481e7cdfa83fe84d09518dd5a8d109ab2480cade682ccbe2"} Mar 19 10:45:37 crc kubenswrapper[4765]: I0319 10:45:37.627030 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b92cc2fe-932d-4290-8331-225b2c5011d4","Type":"ContainerStarted","Data":"c7cec0689cc1843d5c9b6e24c2b16f21346a7620e67f5c52bf21746226e81c33"} Mar 19 10:45:37 crc kubenswrapper[4765]: I0319 10:45:37.627061 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-novncproxy-0" event={"ID":"b92cc2fe-932d-4290-8331-225b2c5011d4","Type":"ContainerStarted","Data":"3b67efb6a731f513f5db0d04bb61d8226d7e2d232c095a54695e9527137276de"} Mar 19 10:45:37 crc kubenswrapper[4765]: I0319 10:45:37.646520 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.646502104 podStartE2EDuration="2.646502104s" podCreationTimestamp="2026-03-19 10:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:45:37.643177173 +0000 UTC m=+1435.992122715" watchObservedRunningTime="2026-03-19 10:45:37.646502104 +0000 UTC m=+1435.995447646" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.098517 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.098904 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" 
Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.104234 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.104306 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.140325 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-novncproxy-0" podStartSLOduration=3.140298302 podStartE2EDuration="3.140298302s" podCreationTimestamp="2026-03-19 10:45:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:45:37.670289962 +0000 UTC m=+1436.019235514" watchObservedRunningTime="2026-03-19 10:45:38.140298302 +0000 UTC m=+1436.489243844" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.307349 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-bq6nt"] Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.308787 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.363363 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-bq6nt"] Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.405284 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.405351 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.405381 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.405433 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.405490 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-config\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.405518 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljx4w\" (UniqueName: \"kubernetes.io/projected/949ced49-5178-4806-a521-3b46431783ba-kube-api-access-ljx4w\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.508222 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.508349 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-config\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.508390 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ljx4w\" (UniqueName: \"kubernetes.io/projected/949ced49-5178-4806-a521-3b46431783ba-kube-api-access-ljx4w\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.508471 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" 
(UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.508501 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.508526 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.509712 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-dns-svc\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.510095 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-ovsdbserver-sb\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.510871 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-config\") pod 
\"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.510922 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-ovsdbserver-nb\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.511459 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-dns-swift-storage-0\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.550448 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ljx4w\" (UniqueName: \"kubernetes.io/projected/949ced49-5178-4806-a521-3b46431783ba-kube-api-access-ljx4w\") pod \"dnsmasq-dns-cd5cbd7b9-bq6nt\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:38 crc kubenswrapper[4765]: I0319 10:45:38.650132 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:39 crc kubenswrapper[4765]: I0319 10:45:39.182927 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-bq6nt"] Mar 19 10:45:39 crc kubenswrapper[4765]: W0319 10:45:39.183420 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod949ced49_5178_4806_a521_3b46431783ba.slice/crio-d1bd3b56ab09c6f8b3f781e9ac52ab8b319a0778e6691f4f010f31d1c9797135 WatchSource:0}: Error finding container d1bd3b56ab09c6f8b3f781e9ac52ab8b319a0778e6691f4f010f31d1c9797135: Status 404 returned error can't find the container with id d1bd3b56ab09c6f8b3f781e9ac52ab8b319a0778e6691f4f010f31d1c9797135 Mar 19 10:45:39 crc kubenswrapper[4765]: I0319 10:45:39.650158 4765 generic.go:334] "Generic (PLEG): container finished" podID="949ced49-5178-4806-a521-3b46431783ba" containerID="91a1835a424bee47aa55c74b4b7ef3089e8509ed1b3e0d472d1367711a8e4cc4" exitCode=0 Mar 19 10:45:39 crc kubenswrapper[4765]: I0319 10:45:39.650373 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" event={"ID":"949ced49-5178-4806-a521-3b46431783ba","Type":"ContainerDied","Data":"91a1835a424bee47aa55c74b4b7ef3089e8509ed1b3e0d472d1367711a8e4cc4"} Mar 19 10:45:39 crc kubenswrapper[4765]: I0319 10:45:39.650461 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" event={"ID":"949ced49-5178-4806-a521-3b46431783ba","Type":"ContainerStarted","Data":"d1bd3b56ab09c6f8b3f781e9ac52ab8b319a0778e6691f4f010f31d1c9797135"} Mar 19 10:45:40 crc kubenswrapper[4765]: I0319 10:45:40.627756 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:45:40 crc kubenswrapper[4765]: I0319 10:45:40.629148 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" 
podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="ceilometer-central-agent" containerID="cri-o://4dd5b51b3307c703257c218ba8122c5cdc6140ccc9759bf558e2c80c7c5bfea7" gracePeriod=30 Mar 19 10:45:40 crc kubenswrapper[4765]: I0319 10:45:40.629245 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="ceilometer-notification-agent" containerID="cri-o://e27a91a7c3d1d859332550838dcccc7497458472981000c3c0a5177bc0bd5953" gracePeriod=30 Mar 19 10:45:40 crc kubenswrapper[4765]: I0319 10:45:40.629172 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="sg-core" containerID="cri-o://85853a11e26a9c5d9f89867f55ca999dafce26f5cba02bc9efc2c3c82424db3c" gracePeriod=30 Mar 19 10:45:40 crc kubenswrapper[4765]: I0319 10:45:40.629426 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/ceilometer-0" podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="proxy-httpd" containerID="cri-o://a97cad91ce6f2cfd1126ce135233ee43d8d31e7a26e9c9d4ba5add6ddc17e4a6" gracePeriod=30 Mar 19 10:45:40 crc kubenswrapper[4765]: I0319 10:45:40.635430 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/ceilometer-0" podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="proxy-httpd" probeResult="failure" output="Get \"https://10.217.0.200:3000/\": EOF" Mar 19 10:45:40 crc kubenswrapper[4765]: I0319 10:45:40.663111 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" event={"ID":"949ced49-5178-4806-a521-3b46431783ba","Type":"ContainerStarted","Data":"dbcae6d1d68b9a447d0c88f93d5423fdfa7b348d6a27d9ef6bc11d14abf9c5e6"} Mar 19 10:45:40 crc kubenswrapper[4765]: I0319 10:45:40.663604 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:40 crc kubenswrapper[4765]: I0319 10:45:40.686323 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" podStartSLOduration=2.686304094 podStartE2EDuration="2.686304094s" podCreationTimestamp="2026-03-19 10:45:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:45:40.681421551 +0000 UTC m=+1439.030367093" watchObservedRunningTime="2026-03-19 10:45:40.686304094 +0000 UTC m=+1439.035249636" Mar 19 10:45:40 crc kubenswrapper[4765]: I0319 10:45:40.767535 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:45:40 crc kubenswrapper[4765]: I0319 10:45:40.767747 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="83046945-46ea-4586-bf0a-3a992cf42900" containerName="nova-api-log" containerID="cri-o://14067a844015a7028308b2fc2554e03f1f0484d84436a5acf1ca643e72476836" gracePeriod=30 Mar 19 10:45:40 crc kubenswrapper[4765]: I0319 10:45:40.768218 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="83046945-46ea-4586-bf0a-3a992cf42900" containerName="nova-api-api" containerID="cri-o://2f871f65b47cb05f427aaf0e2b982616f5c6e706aba3bafb6424e2315955dc48" gracePeriod=30 Mar 19 10:45:41 crc kubenswrapper[4765]: I0319 10:45:41.386749 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:41 crc kubenswrapper[4765]: I0319 10:45:41.674669 4765 generic.go:334] "Generic (PLEG): container finished" podID="83046945-46ea-4586-bf0a-3a992cf42900" containerID="14067a844015a7028308b2fc2554e03f1f0484d84436a5acf1ca643e72476836" exitCode=143 Mar 19 10:45:41 crc kubenswrapper[4765]: I0319 10:45:41.674759 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/nova-api-0" event={"ID":"83046945-46ea-4586-bf0a-3a992cf42900","Type":"ContainerDied","Data":"14067a844015a7028308b2fc2554e03f1f0484d84436a5acf1ca643e72476836"} Mar 19 10:45:41 crc kubenswrapper[4765]: I0319 10:45:41.677999 4765 generic.go:334] "Generic (PLEG): container finished" podID="7d7e9480-4187-4b89-9638-88131527013a" containerID="a97cad91ce6f2cfd1126ce135233ee43d8d31e7a26e9c9d4ba5add6ddc17e4a6" exitCode=0 Mar 19 10:45:41 crc kubenswrapper[4765]: I0319 10:45:41.678032 4765 generic.go:334] "Generic (PLEG): container finished" podID="7d7e9480-4187-4b89-9638-88131527013a" containerID="85853a11e26a9c5d9f89867f55ca999dafce26f5cba02bc9efc2c3c82424db3c" exitCode=2 Mar 19 10:45:41 crc kubenswrapper[4765]: I0319 10:45:41.678041 4765 generic.go:334] "Generic (PLEG): container finished" podID="7d7e9480-4187-4b89-9638-88131527013a" containerID="e27a91a7c3d1d859332550838dcccc7497458472981000c3c0a5177bc0bd5953" exitCode=0 Mar 19 10:45:41 crc kubenswrapper[4765]: I0319 10:45:41.678049 4765 generic.go:334] "Generic (PLEG): container finished" podID="7d7e9480-4187-4b89-9638-88131527013a" containerID="4dd5b51b3307c703257c218ba8122c5cdc6140ccc9759bf558e2c80c7c5bfea7" exitCode=0 Mar 19 10:45:41 crc kubenswrapper[4765]: I0319 10:45:41.678076 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7e9480-4187-4b89-9638-88131527013a","Type":"ContainerDied","Data":"a97cad91ce6f2cfd1126ce135233ee43d8d31e7a26e9c9d4ba5add6ddc17e4a6"} Mar 19 10:45:41 crc kubenswrapper[4765]: I0319 10:45:41.678124 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7e9480-4187-4b89-9638-88131527013a","Type":"ContainerDied","Data":"85853a11e26a9c5d9f89867f55ca999dafce26f5cba02bc9efc2c3c82424db3c"} Mar 19 10:45:41 crc kubenswrapper[4765]: I0319 10:45:41.678135 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"7d7e9480-4187-4b89-9638-88131527013a","Type":"ContainerDied","Data":"e27a91a7c3d1d859332550838dcccc7497458472981000c3c0a5177bc0bd5953"} Mar 19 10:45:41 crc kubenswrapper[4765]: I0319 10:45:41.678145 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7e9480-4187-4b89-9638-88131527013a","Type":"ContainerDied","Data":"4dd5b51b3307c703257c218ba8122c5cdc6140ccc9759bf558e2c80c7c5bfea7"} Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.376835 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.491702 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7e9480-4187-4b89-9638-88131527013a-run-httpd\") pod \"7d7e9480-4187-4b89-9638-88131527013a\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.492030 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-ceilometer-tls-certs\") pod \"7d7e9480-4187-4b89-9638-88131527013a\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.492053 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d7e9480-4187-4b89-9638-88131527013a-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "7d7e9480-4187-4b89-9638-88131527013a" (UID: "7d7e9480-4187-4b89-9638-88131527013a"). InnerVolumeSpecName "run-httpd". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.492083 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-config-data\") pod \"7d7e9480-4187-4b89-9638-88131527013a\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.492172 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vsgwl\" (UniqueName: \"kubernetes.io/projected/7d7e9480-4187-4b89-9638-88131527013a-kube-api-access-vsgwl\") pod \"7d7e9480-4187-4b89-9638-88131527013a\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.492244 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-combined-ca-bundle\") pod \"7d7e9480-4187-4b89-9638-88131527013a\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.492287 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-sg-core-conf-yaml\") pod \"7d7e9480-4187-4b89-9638-88131527013a\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.492334 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7e9480-4187-4b89-9638-88131527013a-log-httpd\") pod \"7d7e9480-4187-4b89-9638-88131527013a\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.492379 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-scripts\") pod \"7d7e9480-4187-4b89-9638-88131527013a\" (UID: \"7d7e9480-4187-4b89-9638-88131527013a\") " Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.492766 4765 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7e9480-4187-4b89-9638-88131527013a-run-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.493014 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7d7e9480-4187-4b89-9638-88131527013a-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "7d7e9480-4187-4b89-9638-88131527013a" (UID: "7d7e9480-4187-4b89-9638-88131527013a"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.497500 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-scripts" (OuterVolumeSpecName: "scripts") pod "7d7e9480-4187-4b89-9638-88131527013a" (UID: "7d7e9480-4187-4b89-9638-88131527013a"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.498294 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d7e9480-4187-4b89-9638-88131527013a-kube-api-access-vsgwl" (OuterVolumeSpecName: "kube-api-access-vsgwl") pod "7d7e9480-4187-4b89-9638-88131527013a" (UID: "7d7e9480-4187-4b89-9638-88131527013a"). InnerVolumeSpecName "kube-api-access-vsgwl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.525291 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-sg-core-conf-yaml" (OuterVolumeSpecName: "sg-core-conf-yaml") pod "7d7e9480-4187-4b89-9638-88131527013a" (UID: "7d7e9480-4187-4b89-9638-88131527013a"). InnerVolumeSpecName "sg-core-conf-yaml". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.547980 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-ceilometer-tls-certs" (OuterVolumeSpecName: "ceilometer-tls-certs") pod "7d7e9480-4187-4b89-9638-88131527013a" (UID: "7d7e9480-4187-4b89-9638-88131527013a"). InnerVolumeSpecName "ceilometer-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.583940 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7d7e9480-4187-4b89-9638-88131527013a" (UID: "7d7e9480-4187-4b89-9638-88131527013a"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.594366 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.594409 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-ceilometer-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.594424 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vsgwl\" (UniqueName: \"kubernetes.io/projected/7d7e9480-4187-4b89-9638-88131527013a-kube-api-access-vsgwl\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.594436 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.594448 4765 reconciler_common.go:293] "Volume detached for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-sg-core-conf-yaml\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.594459 4765 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/7d7e9480-4187-4b89-9638-88131527013a-log-httpd\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.610102 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-config-data" (OuterVolumeSpecName: "config-data") pod "7d7e9480-4187-4b89-9638-88131527013a" (UID: 
"7d7e9480-4187-4b89-9638-88131527013a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.690388 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"7d7e9480-4187-4b89-9638-88131527013a","Type":"ContainerDied","Data":"97e479c518f7867be143c283a95c89b5c1653012c06a9be0ac914c98b1ae05c9"} Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.690457 4765 scope.go:117] "RemoveContainer" containerID="a97cad91ce6f2cfd1126ce135233ee43d8d31e7a26e9c9d4ba5add6ddc17e4a6" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.690460 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.695566 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7d7e9480-4187-4b89-9638-88131527013a-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.713544 4765 scope.go:117] "RemoveContainer" containerID="85853a11e26a9c5d9f89867f55ca999dafce26f5cba02bc9efc2c3c82424db3c" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.730772 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.742216 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.750687 4765 scope.go:117] "RemoveContainer" containerID="e27a91a7c3d1d859332550838dcccc7497458472981000c3c0a5177bc0bd5953" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.751299 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:45:42 crc kubenswrapper[4765]: E0319 10:45:42.751672 4765 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="sg-core" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.751691 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="sg-core" Mar 19 10:45:42 crc kubenswrapper[4765]: E0319 10:45:42.751710 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="ceilometer-notification-agent" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.751717 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="ceilometer-notification-agent" Mar 19 10:45:42 crc kubenswrapper[4765]: E0319 10:45:42.751732 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="ceilometer-central-agent" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.751738 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="ceilometer-central-agent" Mar 19 10:45:42 crc kubenswrapper[4765]: E0319 10:45:42.751751 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="proxy-httpd" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.751757 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="proxy-httpd" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.751941 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="ceilometer-central-agent" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.752043 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="proxy-httpd" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.752057 4765 memory_manager.go:354] 
"RemoveStaleState removing state" podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="ceilometer-notification-agent" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.752073 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d7e9480-4187-4b89-9638-88131527013a" containerName="sg-core" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.753613 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.756927 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-config-data" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.757168 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-scripts" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.757286 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-ceilometer-internal-svc" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.762749 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.779780 4765 scope.go:117] "RemoveContainer" containerID="4dd5b51b3307c703257c218ba8122c5cdc6140ccc9759bf558e2c80c7c5bfea7" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.797123 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ff62acd-7f88-46c9-bd52-150092370b2d-run-httpd\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.797178 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ff62acd-7f88-46c9-bd52-150092370b2d-log-httpd\") pod \"ceilometer-0\" (UID: 
\"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.797197 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ff62acd-7f88-46c9-bd52-150092370b2d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.797237 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff62acd-7f88-46c9-bd52-150092370b2d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.797298 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ff62acd-7f88-46c9-bd52-150092370b2d-config-data\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.797336 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ff62acd-7f88-46c9-bd52-150092370b2d-scripts\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.797371 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9vlg\" (UniqueName: \"kubernetes.io/projected/5ff62acd-7f88-46c9-bd52-150092370b2d-kube-api-access-d9vlg\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 
10:45:42.797400 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff62acd-7f88-46c9-bd52-150092370b2d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.899274 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ff62acd-7f88-46c9-bd52-150092370b2d-config-data\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.899350 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ff62acd-7f88-46c9-bd52-150092370b2d-scripts\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.899392 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d9vlg\" (UniqueName: \"kubernetes.io/projected/5ff62acd-7f88-46c9-bd52-150092370b2d-kube-api-access-d9vlg\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.899422 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff62acd-7f88-46c9-bd52-150092370b2d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.899469 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-httpd\" (UniqueName: 
\"kubernetes.io/empty-dir/5ff62acd-7f88-46c9-bd52-150092370b2d-run-httpd\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.899493 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ff62acd-7f88-46c9-bd52-150092370b2d-log-httpd\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.899506 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ff62acd-7f88-46c9-bd52-150092370b2d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.899539 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff62acd-7f88-46c9-bd52-150092370b2d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.903262 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-tls-certs\" (UniqueName: \"kubernetes.io/secret/5ff62acd-7f88-46c9-bd52-150092370b2d-ceilometer-tls-certs\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.903525 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ff62acd-7f88-46c9-bd52-150092370b2d-run-httpd\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 
10:45:42.903763 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/5ff62acd-7f88-46c9-bd52-150092370b2d-log-httpd\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.905030 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5ff62acd-7f88-46c9-bd52-150092370b2d-combined-ca-bundle\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.908030 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sg-core-conf-yaml\" (UniqueName: \"kubernetes.io/secret/5ff62acd-7f88-46c9-bd52-150092370b2d-sg-core-conf-yaml\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.908952 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5ff62acd-7f88-46c9-bd52-150092370b2d-scripts\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.909814 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5ff62acd-7f88-46c9-bd52-150092370b2d-config-data\") pod \"ceilometer-0\" (UID: \"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:42 crc kubenswrapper[4765]: I0319 10:45:42.922795 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9vlg\" (UniqueName: \"kubernetes.io/projected/5ff62acd-7f88-46c9-bd52-150092370b2d-kube-api-access-d9vlg\") pod \"ceilometer-0\" (UID: 
\"5ff62acd-7f88-46c9-bd52-150092370b2d\") " pod="openstack/ceilometer-0" Mar 19 10:45:43 crc kubenswrapper[4765]: I0319 10:45:43.072760 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ceilometer-0" Mar 19 10:45:43 crc kubenswrapper[4765]: I0319 10:45:43.536821 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ceilometer-0"] Mar 19 10:45:43 crc kubenswrapper[4765]: I0319 10:45:43.699417 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ff62acd-7f88-46c9-bd52-150092370b2d","Type":"ContainerStarted","Data":"cceda40d6ba8d2646ea1f81e4bd0053049ded6cc5e8b45eca41364e8f07fc581"} Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.368690 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d7e9480-4187-4b89-9638-88131527013a" path="/var/lib/kubelet/pods/7d7e9480-4187-4b89-9638-88131527013a/volumes" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.486484 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.545262 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83046945-46ea-4586-bf0a-3a992cf42900-config-data\") pod \"83046945-46ea-4586-bf0a-3a992cf42900\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.548681 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmjnb\" (UniqueName: \"kubernetes.io/projected/83046945-46ea-4586-bf0a-3a992cf42900-kube-api-access-fmjnb\") pod \"83046945-46ea-4586-bf0a-3a992cf42900\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.548851 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83046945-46ea-4586-bf0a-3a992cf42900-logs\") pod \"83046945-46ea-4586-bf0a-3a992cf42900\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.548876 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83046945-46ea-4586-bf0a-3a992cf42900-combined-ca-bundle\") pod \"83046945-46ea-4586-bf0a-3a992cf42900\" (UID: \"83046945-46ea-4586-bf0a-3a992cf42900\") " Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.549799 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83046945-46ea-4586-bf0a-3a992cf42900-logs" (OuterVolumeSpecName: "logs") pod "83046945-46ea-4586-bf0a-3a992cf42900" (UID: "83046945-46ea-4586-bf0a-3a992cf42900"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.550948 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83046945-46ea-4586-bf0a-3a992cf42900-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.565446 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83046945-46ea-4586-bf0a-3a992cf42900-kube-api-access-fmjnb" (OuterVolumeSpecName: "kube-api-access-fmjnb") pod "83046945-46ea-4586-bf0a-3a992cf42900" (UID: "83046945-46ea-4586-bf0a-3a992cf42900"). InnerVolumeSpecName "kube-api-access-fmjnb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.578188 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83046945-46ea-4586-bf0a-3a992cf42900-config-data" (OuterVolumeSpecName: "config-data") pod "83046945-46ea-4586-bf0a-3a992cf42900" (UID: "83046945-46ea-4586-bf0a-3a992cf42900"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.581748 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83046945-46ea-4586-bf0a-3a992cf42900-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "83046945-46ea-4586-bf0a-3a992cf42900" (UID: "83046945-46ea-4586-bf0a-3a992cf42900"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.653437 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83046945-46ea-4586-bf0a-3a992cf42900-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.653499 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fmjnb\" (UniqueName: \"kubernetes.io/projected/83046945-46ea-4586-bf0a-3a992cf42900-kube-api-access-fmjnb\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.653518 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/83046945-46ea-4586-bf0a-3a992cf42900-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.715947 4765 generic.go:334] "Generic (PLEG): container finished" podID="83046945-46ea-4586-bf0a-3a992cf42900" containerID="2f871f65b47cb05f427aaf0e2b982616f5c6e706aba3bafb6424e2315955dc48" exitCode=0 Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.716025 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83046945-46ea-4586-bf0a-3a992cf42900","Type":"ContainerDied","Data":"2f871f65b47cb05f427aaf0e2b982616f5c6e706aba3bafb6424e2315955dc48"} Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.716051 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"83046945-46ea-4586-bf0a-3a992cf42900","Type":"ContainerDied","Data":"3ab7e89950991438a1678e4e2b8e140e8192fe82d6354ae234d739b0cb532570"} Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.716066 4765 scope.go:117] "RemoveContainer" containerID="2f871f65b47cb05f427aaf0e2b982616f5c6e706aba3bafb6424e2315955dc48" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.716191 4765 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.738856 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ff62acd-7f88-46c9-bd52-150092370b2d","Type":"ContainerStarted","Data":"493f92650494b0a1a94e0c8c0e04915c352dcac8fbcdbbf7ee54e110872368ab"} Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.758117 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.769719 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.776531 4765 scope.go:117] "RemoveContainer" containerID="14067a844015a7028308b2fc2554e03f1f0484d84436a5acf1ca643e72476836" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.786008 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 10:45:44 crc kubenswrapper[4765]: E0319 10:45:44.786527 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83046945-46ea-4586-bf0a-3a992cf42900" containerName="nova-api-api" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.786546 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="83046945-46ea-4586-bf0a-3a992cf42900" containerName="nova-api-api" Mar 19 10:45:44 crc kubenswrapper[4765]: E0319 10:45:44.786570 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83046945-46ea-4586-bf0a-3a992cf42900" containerName="nova-api-log" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.786577 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="83046945-46ea-4586-bf0a-3a992cf42900" containerName="nova-api-log" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.786785 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="83046945-46ea-4586-bf0a-3a992cf42900" containerName="nova-api-log" Mar 19 10:45:44 crc kubenswrapper[4765]: 
I0319 10:45:44.786819 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="83046945-46ea-4586-bf0a-3a992cf42900" containerName="nova-api-api" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.787832 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.793416 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.793581 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.793697 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.800585 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.838282 4765 scope.go:117] "RemoveContainer" containerID="2f871f65b47cb05f427aaf0e2b982616f5c6e706aba3bafb6424e2315955dc48" Mar 19 10:45:44 crc kubenswrapper[4765]: E0319 10:45:44.839489 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f871f65b47cb05f427aaf0e2b982616f5c6e706aba3bafb6424e2315955dc48\": container with ID starting with 2f871f65b47cb05f427aaf0e2b982616f5c6e706aba3bafb6424e2315955dc48 not found: ID does not exist" containerID="2f871f65b47cb05f427aaf0e2b982616f5c6e706aba3bafb6424e2315955dc48" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.839523 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f871f65b47cb05f427aaf0e2b982616f5c6e706aba3bafb6424e2315955dc48"} err="failed to get container status \"2f871f65b47cb05f427aaf0e2b982616f5c6e706aba3bafb6424e2315955dc48\": rpc error: code = NotFound desc = could not find 
container \"2f871f65b47cb05f427aaf0e2b982616f5c6e706aba3bafb6424e2315955dc48\": container with ID starting with 2f871f65b47cb05f427aaf0e2b982616f5c6e706aba3bafb6424e2315955dc48 not found: ID does not exist" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.839547 4765 scope.go:117] "RemoveContainer" containerID="14067a844015a7028308b2fc2554e03f1f0484d84436a5acf1ca643e72476836" Mar 19 10:45:44 crc kubenswrapper[4765]: E0319 10:45:44.839836 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"14067a844015a7028308b2fc2554e03f1f0484d84436a5acf1ca643e72476836\": container with ID starting with 14067a844015a7028308b2fc2554e03f1f0484d84436a5acf1ca643e72476836 not found: ID does not exist" containerID="14067a844015a7028308b2fc2554e03f1f0484d84436a5acf1ca643e72476836" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.839858 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"14067a844015a7028308b2fc2554e03f1f0484d84436a5acf1ca643e72476836"} err="failed to get container status \"14067a844015a7028308b2fc2554e03f1f0484d84436a5acf1ca643e72476836\": rpc error: code = NotFound desc = could not find container \"14067a844015a7028308b2fc2554e03f1f0484d84436a5acf1ca643e72476836\": container with ID starting with 14067a844015a7028308b2fc2554e03f1f0484d84436a5acf1ca643e72476836 not found: ID does not exist" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.856351 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvhhz\" (UniqueName: \"kubernetes.io/projected/483ee338-f847-4a1f-b644-aaf67fcd8fee-kube-api-access-bvhhz\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.856614 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.856735 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-internal-tls-certs\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.856839 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-config-data\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.856951 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-public-tls-certs\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.857071 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/483ee338-f847-4a1f-b644-aaf67fcd8fee-logs\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.959150 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-combined-ca-bundle\") pod \"nova-api-0\" (UID: 
\"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.959216 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-internal-tls-certs\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.959272 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-config-data\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.959357 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-public-tls-certs\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.959399 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/483ee338-f847-4a1f-b644-aaf67fcd8fee-logs\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.959497 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvhhz\" (UniqueName: \"kubernetes.io/projected/483ee338-f847-4a1f-b644-aaf67fcd8fee-kube-api-access-bvhhz\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.960053 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/483ee338-f847-4a1f-b644-aaf67fcd8fee-logs\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.964028 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-public-tls-certs\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.964068 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-internal-tls-certs\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.964420 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.964632 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-config-data\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:44 crc kubenswrapper[4765]: I0319 10:45:44.976638 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvhhz\" (UniqueName: \"kubernetes.io/projected/483ee338-f847-4a1f-b644-aaf67fcd8fee-kube-api-access-bvhhz\") pod \"nova-api-0\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " pod="openstack/nova-api-0" Mar 19 10:45:45 crc kubenswrapper[4765]: I0319 10:45:45.118615 4765 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 10:45:45 crc kubenswrapper[4765]: I0319 10:45:45.854780 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:45:45 crc kubenswrapper[4765]: W0319 10:45:45.858234 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod483ee338_f847_4a1f_b644_aaf67fcd8fee.slice/crio-520305eaafe4bae2782db5b5065ab350de090847f97aa66eb848f494ddb6ed45 WatchSource:0}: Error finding container 520305eaafe4bae2782db5b5065ab350de090847f97aa66eb848f494ddb6ed45: Status 404 returned error can't find the container with id 520305eaafe4bae2782db5b5065ab350de090847f97aa66eb848f494ddb6ed45 Mar 19 10:45:46 crc kubenswrapper[4765]: I0319 10:45:46.368285 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83046945-46ea-4586-bf0a-3a992cf42900" path="/var/lib/kubelet/pods/83046945-46ea-4586-bf0a-3a992cf42900/volumes" Mar 19 10:45:46 crc kubenswrapper[4765]: I0319 10:45:46.369276 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 10:45:46 crc kubenswrapper[4765]: I0319 10:45:46.369323 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 10:45:46 crc kubenswrapper[4765]: I0319 10:45:46.386365 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:46 crc kubenswrapper[4765]: I0319 10:45:46.407240 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:46 crc kubenswrapper[4765]: I0319 10:45:46.763755 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" 
event={"ID":"5ff62acd-7f88-46c9-bd52-150092370b2d","Type":"ContainerStarted","Data":"20396377b8bf05acf9e398165231ceeee43e68b7955b8a857a350750001aaf71"} Mar 19 10:45:46 crc kubenswrapper[4765]: I0319 10:45:46.763808 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ff62acd-7f88-46c9-bd52-150092370b2d","Type":"ContainerStarted","Data":"b47fcc9771835f5506ca63503851405813edd8b8ae0c21b6484f6646507aa101"} Mar 19 10:45:46 crc kubenswrapper[4765]: I0319 10:45:46.766220 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"483ee338-f847-4a1f-b644-aaf67fcd8fee","Type":"ContainerStarted","Data":"e749b1ce2540cbf495c6ceaccd8d239894b109cc8c5b4a6d53fb78ef35145f9c"} Mar 19 10:45:46 crc kubenswrapper[4765]: I0319 10:45:46.766269 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"483ee338-f847-4a1f-b644-aaf67fcd8fee","Type":"ContainerStarted","Data":"c9fe9d7aa7ac53d67e5acba36d2f2554e1e97d662eace2dee6870b0037858fa7"} Mar 19 10:45:46 crc kubenswrapper[4765]: I0319 10:45:46.766284 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"483ee338-f847-4a1f-b644-aaf67fcd8fee","Type":"ContainerStarted","Data":"520305eaafe4bae2782db5b5065ab350de090847f97aa66eb848f494ddb6ed45"} Mar 19 10:45:46 crc kubenswrapper[4765]: I0319 10:45:46.784684 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.784661964 podStartE2EDuration="2.784661964s" podCreationTimestamp="2026-03-19 10:45:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:45:46.784257513 +0000 UTC m=+1445.133203065" watchObservedRunningTime="2026-03-19 10:45:46.784661964 +0000 UTC m=+1445.133607506" Mar 19 10:45:46 crc kubenswrapper[4765]: I0319 10:45:46.789254 4765 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openstack/nova-cell1-novncproxy-0" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.013878 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-cell1-cell-mapping-prtgb"] Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.018069 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.021348 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-config-data" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.024707 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-manage-scripts" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.056794 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-prtgb"] Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.111464 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-scripts\") pod \"nova-cell1-cell-mapping-prtgb\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.111560 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c7xv6\" (UniqueName: \"kubernetes.io/projected/1669f945-1247-495f-b598-de4d6703a5cd-kube-api-access-c7xv6\") pod \"nova-cell1-cell-mapping-prtgb\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.111595 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-config-data\") pod \"nova-cell1-cell-mapping-prtgb\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.111627 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-prtgb\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.215234 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-scripts\") pod \"nova-cell1-cell-mapping-prtgb\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.215364 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c7xv6\" (UniqueName: \"kubernetes.io/projected/1669f945-1247-495f-b598-de4d6703a5cd-kube-api-access-c7xv6\") pod \"nova-cell1-cell-mapping-prtgb\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.215423 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-config-data\") pod \"nova-cell1-cell-mapping-prtgb\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.215745 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-prtgb\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.222779 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-combined-ca-bundle\") pod \"nova-cell1-cell-mapping-prtgb\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.224406 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-scripts\") pod \"nova-cell1-cell-mapping-prtgb\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.236723 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-config-data\") pod \"nova-cell1-cell-mapping-prtgb\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.252580 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c7xv6\" (UniqueName: \"kubernetes.io/projected/1669f945-1247-495f-b598-de4d6703a5cd-kube-api-access-c7xv6\") pod \"nova-cell1-cell-mapping-prtgb\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.365672 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.380230 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="09992c98-fab5-441f-b4a5-abc94d180a52" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.380287 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="09992c98-fab5-441f-b4a5-abc94d180a52" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.203:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:45:47 crc kubenswrapper[4765]: I0319 10:45:47.854368 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-cell1-cell-mapping-prtgb"] Mar 19 10:45:47 crc kubenswrapper[4765]: W0319 10:45:47.872702 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1669f945_1247_495f_b598_de4d6703a5cd.slice/crio-50a20cda9978e913e143bd2aaf939fcf7dadd491c46ddc83ebf22da75382d246 WatchSource:0}: Error finding container 50a20cda9978e913e143bd2aaf939fcf7dadd491c46ddc83ebf22da75382d246: Status 404 returned error can't find the container with id 50a20cda9978e913e143bd2aaf939fcf7dadd491c46ddc83ebf22da75382d246 Mar 19 10:45:48 crc kubenswrapper[4765]: I0319 10:45:48.653211 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:45:48 crc kubenswrapper[4765]: I0319 10:45:48.716220 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-26rbh"] Mar 19 10:45:48 crc kubenswrapper[4765]: I0319 10:45:48.716475 4765 kuberuntime_container.go:808] "Killing container with a 
grace period" pod="openstack/dnsmasq-dns-bccf8f775-26rbh" podUID="39a8a9e0-6620-4e67-9b23-7acd3a2aff5d" containerName="dnsmasq-dns" containerID="cri-o://ab40e9c6c0495b21297f27b2f9dfb9064a146f0ae4022348f6abc259daca6845" gracePeriod=10 Mar 19 10:45:48 crc kubenswrapper[4765]: I0319 10:45:48.804250 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-prtgb" event={"ID":"1669f945-1247-495f-b598-de4d6703a5cd","Type":"ContainerStarted","Data":"36ccb5018079259f39da06eb2f26efb097b024f3fb278fd29fbcf63a153a015f"} Mar 19 10:45:48 crc kubenswrapper[4765]: I0319 10:45:48.804316 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-prtgb" event={"ID":"1669f945-1247-495f-b598-de4d6703a5cd","Type":"ContainerStarted","Data":"50a20cda9978e913e143bd2aaf939fcf7dadd491c46ddc83ebf22da75382d246"} Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.194555 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.219778 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-cell1-cell-mapping-prtgb" podStartSLOduration=3.219756754 podStartE2EDuration="3.219756754s" podCreationTimestamp="2026-03-19 10:45:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:45:48.826311448 +0000 UTC m=+1447.175256990" watchObservedRunningTime="2026-03-19 10:45:49.219756754 +0000 UTC m=+1447.568702296" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.370387 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-ovsdbserver-nb\") pod \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " Mar 19 10:45:49 crc 
kubenswrapper[4765]: I0319 10:45:49.370490 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-dns-swift-storage-0\") pod \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.370510 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-ovsdbserver-sb\") pod \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.370552 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-dns-svc\") pod \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.370594 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4bl5d\" (UniqueName: \"kubernetes.io/projected/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-kube-api-access-4bl5d\") pod \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.370620 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-config\") pod \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\" (UID: \"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d\") " Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.380224 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-kube-api-access-4bl5d" (OuterVolumeSpecName: 
"kube-api-access-4bl5d") pod "39a8a9e0-6620-4e67-9b23-7acd3a2aff5d" (UID: "39a8a9e0-6620-4e67-9b23-7acd3a2aff5d"). InnerVolumeSpecName "kube-api-access-4bl5d". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.447106 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "39a8a9e0-6620-4e67-9b23-7acd3a2aff5d" (UID: "39a8a9e0-6620-4e67-9b23-7acd3a2aff5d"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.463036 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "39a8a9e0-6620-4e67-9b23-7acd3a2aff5d" (UID: "39a8a9e0-6620-4e67-9b23-7acd3a2aff5d"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.463480 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "39a8a9e0-6620-4e67-9b23-7acd3a2aff5d" (UID: "39a8a9e0-6620-4e67-9b23-7acd3a2aff5d"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.480735 4765 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.480773 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.480786 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.480798 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4bl5d\" (UniqueName: \"kubernetes.io/projected/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-kube-api-access-4bl5d\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.481358 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-config" (OuterVolumeSpecName: "config") pod "39a8a9e0-6620-4e67-9b23-7acd3a2aff5d" (UID: "39a8a9e0-6620-4e67-9b23-7acd3a2aff5d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.484615 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "39a8a9e0-6620-4e67-9b23-7acd3a2aff5d" (UID: "39a8a9e0-6620-4e67-9b23-7acd3a2aff5d"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.582550 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.582615 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.844809 4765 generic.go:334] "Generic (PLEG): container finished" podID="39a8a9e0-6620-4e67-9b23-7acd3a2aff5d" containerID="ab40e9c6c0495b21297f27b2f9dfb9064a146f0ae4022348f6abc259daca6845" exitCode=0 Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.845130 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-26rbh" event={"ID":"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d","Type":"ContainerDied","Data":"ab40e9c6c0495b21297f27b2f9dfb9064a146f0ae4022348f6abc259daca6845"} Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.846065 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-bccf8f775-26rbh" event={"ID":"39a8a9e0-6620-4e67-9b23-7acd3a2aff5d","Type":"ContainerDied","Data":"299c0466c94eeb8bfd55d5387030e2d130e6f61cf0e5bc3707be964495cadaf3"} Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.846107 4765 scope.go:117] "RemoveContainer" containerID="ab40e9c6c0495b21297f27b2f9dfb9064a146f0ae4022348f6abc259daca6845" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.845270 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-bccf8f775-26rbh" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.850743 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ceilometer-0" event={"ID":"5ff62acd-7f88-46c9-bd52-150092370b2d","Type":"ContainerStarted","Data":"d4434e1b424c76376a0c5ddf7c972fc92dffd66c74455e037dc65a50993a8422"} Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.850792 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/ceilometer-0" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.871920 4765 scope.go:117] "RemoveContainer" containerID="9cd5ffafcf386aca9a16e09b726a5bdd6fa060debfbf7b4792da29ec781ef4a5" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.882826 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ceilometer-0" podStartSLOduration=2.942073091 podStartE2EDuration="7.882802482s" podCreationTimestamp="2026-03-19 10:45:42 +0000 UTC" firstStartedPulling="2026-03-19 10:45:43.546655786 +0000 UTC m=+1441.895601328" lastFinishedPulling="2026-03-19 10:45:48.487385177 +0000 UTC m=+1446.836330719" observedRunningTime="2026-03-19 10:45:49.869576142 +0000 UTC m=+1448.218521694" watchObservedRunningTime="2026-03-19 10:45:49.882802482 +0000 UTC m=+1448.231748024" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.903057 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-26rbh"] Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.913427 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-bccf8f775-26rbh"] Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.939178 4765 scope.go:117] "RemoveContainer" containerID="ab40e9c6c0495b21297f27b2f9dfb9064a146f0ae4022348f6abc259daca6845" Mar 19 10:45:49 crc kubenswrapper[4765]: E0319 10:45:49.975580 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"ab40e9c6c0495b21297f27b2f9dfb9064a146f0ae4022348f6abc259daca6845\": container with ID starting with ab40e9c6c0495b21297f27b2f9dfb9064a146f0ae4022348f6abc259daca6845 not found: ID does not exist" containerID="ab40e9c6c0495b21297f27b2f9dfb9064a146f0ae4022348f6abc259daca6845" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.975644 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ab40e9c6c0495b21297f27b2f9dfb9064a146f0ae4022348f6abc259daca6845"} err="failed to get container status \"ab40e9c6c0495b21297f27b2f9dfb9064a146f0ae4022348f6abc259daca6845\": rpc error: code = NotFound desc = could not find container \"ab40e9c6c0495b21297f27b2f9dfb9064a146f0ae4022348f6abc259daca6845\": container with ID starting with ab40e9c6c0495b21297f27b2f9dfb9064a146f0ae4022348f6abc259daca6845 not found: ID does not exist" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.975762 4765 scope.go:117] "RemoveContainer" containerID="9cd5ffafcf386aca9a16e09b726a5bdd6fa060debfbf7b4792da29ec781ef4a5" Mar 19 10:45:49 crc kubenswrapper[4765]: E0319 10:45:49.978145 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cd5ffafcf386aca9a16e09b726a5bdd6fa060debfbf7b4792da29ec781ef4a5\": container with ID starting with 9cd5ffafcf386aca9a16e09b726a5bdd6fa060debfbf7b4792da29ec781ef4a5 not found: ID does not exist" containerID="9cd5ffafcf386aca9a16e09b726a5bdd6fa060debfbf7b4792da29ec781ef4a5" Mar 19 10:45:49 crc kubenswrapper[4765]: I0319 10:45:49.978220 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cd5ffafcf386aca9a16e09b726a5bdd6fa060debfbf7b4792da29ec781ef4a5"} err="failed to get container status \"9cd5ffafcf386aca9a16e09b726a5bdd6fa060debfbf7b4792da29ec781ef4a5\": rpc error: code = NotFound desc = could not find container \"9cd5ffafcf386aca9a16e09b726a5bdd6fa060debfbf7b4792da29ec781ef4a5\": 
container with ID starting with 9cd5ffafcf386aca9a16e09b726a5bdd6fa060debfbf7b4792da29ec781ef4a5 not found: ID does not exist" Mar 19 10:45:50 crc kubenswrapper[4765]: I0319 10:45:50.366285 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39a8a9e0-6620-4e67-9b23-7acd3a2aff5d" path="/var/lib/kubelet/pods/39a8a9e0-6620-4e67-9b23-7acd3a2aff5d/volumes" Mar 19 10:45:53 crc kubenswrapper[4765]: I0319 10:45:53.898782 4765 generic.go:334] "Generic (PLEG): container finished" podID="1669f945-1247-495f-b598-de4d6703a5cd" containerID="36ccb5018079259f39da06eb2f26efb097b024f3fb278fd29fbcf63a153a015f" exitCode=0 Mar 19 10:45:53 crc kubenswrapper[4765]: I0319 10:45:53.898850 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-prtgb" event={"ID":"1669f945-1247-495f-b598-de4d6703a5cd","Type":"ContainerDied","Data":"36ccb5018079259f39da06eb2f26efb097b024f3fb278fd29fbcf63a153a015f"} Mar 19 10:45:54 crc kubenswrapper[4765]: I0319 10:45:54.366865 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 10:45:54 crc kubenswrapper[4765]: I0319 10:45:54.366908 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.118812 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.119169 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.291951 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.396340 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-config-data\") pod \"1669f945-1247-495f-b598-de4d6703a5cd\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.396422 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c7xv6\" (UniqueName: \"kubernetes.io/projected/1669f945-1247-495f-b598-de4d6703a5cd-kube-api-access-c7xv6\") pod \"1669f945-1247-495f-b598-de4d6703a5cd\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.396535 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-combined-ca-bundle\") pod \"1669f945-1247-495f-b598-de4d6703a5cd\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.396591 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-scripts\") pod \"1669f945-1247-495f-b598-de4d6703a5cd\" (UID: \"1669f945-1247-495f-b598-de4d6703a5cd\") " Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.402373 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-scripts" (OuterVolumeSpecName: "scripts") pod "1669f945-1247-495f-b598-de4d6703a5cd" (UID: "1669f945-1247-495f-b598-de4d6703a5cd"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.402416 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1669f945-1247-495f-b598-de4d6703a5cd-kube-api-access-c7xv6" (OuterVolumeSpecName: "kube-api-access-c7xv6") pod "1669f945-1247-495f-b598-de4d6703a5cd" (UID: "1669f945-1247-495f-b598-de4d6703a5cd"). InnerVolumeSpecName "kube-api-access-c7xv6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.424068 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1669f945-1247-495f-b598-de4d6703a5cd" (UID: "1669f945-1247-495f-b598-de4d6703a5cd"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.428013 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-config-data" (OuterVolumeSpecName: "config-data") pod "1669f945-1247-495f-b598-de4d6703a5cd" (UID: "1669f945-1247-495f-b598-de4d6703a5cd"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.498869 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.498927 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c7xv6\" (UniqueName: \"kubernetes.io/projected/1669f945-1247-495f-b598-de4d6703a5cd-kube-api-access-c7xv6\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.498943 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.498974 4765 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1669f945-1247-495f-b598-de4d6703a5cd-scripts\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.922622 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-cell1-cell-mapping-prtgb" event={"ID":"1669f945-1247-495f-b598-de4d6703a5cd","Type":"ContainerDied","Data":"50a20cda9978e913e143bd2aaf939fcf7dadd491c46ddc83ebf22da75382d246"} Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.922864 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="50a20cda9978e913e143bd2aaf939fcf7dadd491c46ddc83ebf22da75382d246" Mar 19 10:45:55 crc kubenswrapper[4765]: I0319 10:45:55.922647 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-cell1-cell-mapping-prtgb" Mar 19 10:45:56 crc kubenswrapper[4765]: I0319 10:45:56.104526 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:45:56 crc kubenswrapper[4765]: I0319 10:45:56.121993 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 10:45:56 crc kubenswrapper[4765]: I0319 10:45:56.122297 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-scheduler-0" podUID="7c47edca-0f68-4cd2-ac44-b14fa99200bb" containerName="nova-scheduler-scheduler" containerID="cri-o://f89b509721f854692c414377e6348e770a6a6fbbea4fd5cacdb78f94ba86cfbd" gracePeriod=30 Mar 19 10:45:56 crc kubenswrapper[4765]: I0319 10:45:56.133167 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="483ee338-f847-4a1f-b644-aaf67fcd8fee" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:45:56 crc kubenswrapper[4765]: I0319 10:45:56.133213 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="483ee338-f847-4a1f-b644-aaf67fcd8fee" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.207:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:45:56 crc kubenswrapper[4765]: I0319 10:45:56.147822 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 10:45:56 crc kubenswrapper[4765]: I0319 10:45:56.148097 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="09992c98-fab5-441f-b4a5-abc94d180a52" containerName="nova-metadata-log" containerID="cri-o://d99103fdab060d01459a8878ba8bcd33ce6202cccc3d69e13624c5e395469855" gracePeriod=30 Mar 19 10:45:56 crc kubenswrapper[4765]: 
I0319 10:45:56.148218 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-metadata-0" podUID="09992c98-fab5-441f-b4a5-abc94d180a52" containerName="nova-metadata-metadata" containerID="cri-o://a9df19d737101f6b88e6f349e59a5da9d4cb7a6810b1b2754333dfa4b450d85e" gracePeriod=30 Mar 19 10:45:56 crc kubenswrapper[4765]: I0319 10:45:56.939724 4765 generic.go:334] "Generic (PLEG): container finished" podID="09992c98-fab5-441f-b4a5-abc94d180a52" containerID="d99103fdab060d01459a8878ba8bcd33ce6202cccc3d69e13624c5e395469855" exitCode=143 Mar 19 10:45:56 crc kubenswrapper[4765]: I0319 10:45:56.940323 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="483ee338-f847-4a1f-b644-aaf67fcd8fee" containerName="nova-api-log" containerID="cri-o://c9fe9d7aa7ac53d67e5acba36d2f2554e1e97d662eace2dee6870b0037858fa7" gracePeriod=30 Mar 19 10:45:56 crc kubenswrapper[4765]: I0319 10:45:56.940409 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09992c98-fab5-441f-b4a5-abc94d180a52","Type":"ContainerDied","Data":"d99103fdab060d01459a8878ba8bcd33ce6202cccc3d69e13624c5e395469855"} Mar 19 10:45:56 crc kubenswrapper[4765]: I0319 10:45:56.940807 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/nova-api-0" podUID="483ee338-f847-4a1f-b644-aaf67fcd8fee" containerName="nova-api-api" containerID="cri-o://e749b1ce2540cbf495c6ceaccd8d239894b109cc8c5b4a6d53fb78ef35145f9c" gracePeriod=30 Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.358736 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.471423 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4dxjt\" (UniqueName: \"kubernetes.io/projected/7c47edca-0f68-4cd2-ac44-b14fa99200bb-kube-api-access-4dxjt\") pod \"7c47edca-0f68-4cd2-ac44-b14fa99200bb\" (UID: \"7c47edca-0f68-4cd2-ac44-b14fa99200bb\") " Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.471592 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c47edca-0f68-4cd2-ac44-b14fa99200bb-config-data\") pod \"7c47edca-0f68-4cd2-ac44-b14fa99200bb\" (UID: \"7c47edca-0f68-4cd2-ac44-b14fa99200bb\") " Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.471653 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c47edca-0f68-4cd2-ac44-b14fa99200bb-combined-ca-bundle\") pod \"7c47edca-0f68-4cd2-ac44-b14fa99200bb\" (UID: \"7c47edca-0f68-4cd2-ac44-b14fa99200bb\") " Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.484231 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c47edca-0f68-4cd2-ac44-b14fa99200bb-kube-api-access-4dxjt" (OuterVolumeSpecName: "kube-api-access-4dxjt") pod "7c47edca-0f68-4cd2-ac44-b14fa99200bb" (UID: "7c47edca-0f68-4cd2-ac44-b14fa99200bb"). InnerVolumeSpecName "kube-api-access-4dxjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.504662 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c47edca-0f68-4cd2-ac44-b14fa99200bb-config-data" (OuterVolumeSpecName: "config-data") pod "7c47edca-0f68-4cd2-ac44-b14fa99200bb" (UID: "7c47edca-0f68-4cd2-ac44-b14fa99200bb"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.529181 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7c47edca-0f68-4cd2-ac44-b14fa99200bb-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "7c47edca-0f68-4cd2-ac44-b14fa99200bb" (UID: "7c47edca-0f68-4cd2-ac44-b14fa99200bb"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.573783 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4dxjt\" (UniqueName: \"kubernetes.io/projected/7c47edca-0f68-4cd2-ac44-b14fa99200bb-kube-api-access-4dxjt\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.573817 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/7c47edca-0f68-4cd2-ac44-b14fa99200bb-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.573827 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/7c47edca-0f68-4cd2-ac44-b14fa99200bb-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.949360 4765 generic.go:334] "Generic (PLEG): container finished" podID="7c47edca-0f68-4cd2-ac44-b14fa99200bb" containerID="f89b509721f854692c414377e6348e770a6a6fbbea4fd5cacdb78f94ba86cfbd" exitCode=0 Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.949424 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.949425 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7c47edca-0f68-4cd2-ac44-b14fa99200bb","Type":"ContainerDied","Data":"f89b509721f854692c414377e6348e770a6a6fbbea4fd5cacdb78f94ba86cfbd"} Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.950017 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"7c47edca-0f68-4cd2-ac44-b14fa99200bb","Type":"ContainerDied","Data":"34c37318a2ac974a7be40e4c7cda0461e32359b5801aadda42dd75a1c49f36dd"} Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.950041 4765 scope.go:117] "RemoveContainer" containerID="f89b509721f854692c414377e6348e770a6a6fbbea4fd5cacdb78f94ba86cfbd" Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.952585 4765 generic.go:334] "Generic (PLEG): container finished" podID="483ee338-f847-4a1f-b644-aaf67fcd8fee" containerID="c9fe9d7aa7ac53d67e5acba36d2f2554e1e97d662eace2dee6870b0037858fa7" exitCode=143 Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.952622 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"483ee338-f847-4a1f-b644-aaf67fcd8fee","Type":"ContainerDied","Data":"c9fe9d7aa7ac53d67e5acba36d2f2554e1e97d662eace2dee6870b0037858fa7"} Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.971894 4765 scope.go:117] "RemoveContainer" containerID="f89b509721f854692c414377e6348e770a6a6fbbea4fd5cacdb78f94ba86cfbd" Mar 19 10:45:57 crc kubenswrapper[4765]: E0319 10:45:57.976176 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f89b509721f854692c414377e6348e770a6a6fbbea4fd5cacdb78f94ba86cfbd\": container with ID starting with f89b509721f854692c414377e6348e770a6a6fbbea4fd5cacdb78f94ba86cfbd not found: ID does not exist" 
containerID="f89b509721f854692c414377e6348e770a6a6fbbea4fd5cacdb78f94ba86cfbd" Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.976230 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f89b509721f854692c414377e6348e770a6a6fbbea4fd5cacdb78f94ba86cfbd"} err="failed to get container status \"f89b509721f854692c414377e6348e770a6a6fbbea4fd5cacdb78f94ba86cfbd\": rpc error: code = NotFound desc = could not find container \"f89b509721f854692c414377e6348e770a6a6fbbea4fd5cacdb78f94ba86cfbd\": container with ID starting with f89b509721f854692c414377e6348e770a6a6fbbea4fd5cacdb78f94ba86cfbd not found: ID does not exist" Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.981368 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 10:45:57 crc kubenswrapper[4765]: I0319 10:45:57.997269 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.007809 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 10:45:58 crc kubenswrapper[4765]: E0319 10:45:58.008260 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c47edca-0f68-4cd2-ac44-b14fa99200bb" containerName="nova-scheduler-scheduler" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.008284 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c47edca-0f68-4cd2-ac44-b14fa99200bb" containerName="nova-scheduler-scheduler" Mar 19 10:45:58 crc kubenswrapper[4765]: E0319 10:45:58.008300 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a8a9e0-6620-4e67-9b23-7acd3a2aff5d" containerName="init" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.008306 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a8a9e0-6620-4e67-9b23-7acd3a2aff5d" containerName="init" Mar 19 10:45:58 crc kubenswrapper[4765]: E0319 10:45:58.008324 4765 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1669f945-1247-495f-b598-de4d6703a5cd" containerName="nova-manage" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.008330 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1669f945-1247-495f-b598-de4d6703a5cd" containerName="nova-manage" Mar 19 10:45:58 crc kubenswrapper[4765]: E0319 10:45:58.008341 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39a8a9e0-6620-4e67-9b23-7acd3a2aff5d" containerName="dnsmasq-dns" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.008346 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="39a8a9e0-6620-4e67-9b23-7acd3a2aff5d" containerName="dnsmasq-dns" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.008522 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="39a8a9e0-6620-4e67-9b23-7acd3a2aff5d" containerName="dnsmasq-dns" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.008543 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1669f945-1247-495f-b598-de4d6703a5cd" containerName="nova-manage" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.008564 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c47edca-0f68-4cd2-ac44-b14fa99200bb" containerName="nova-scheduler-scheduler" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.009267 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.011344 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-scheduler-config-data" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.018236 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.183629 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/584d891c-1a52-4300-b19a-51a3594bdccf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"584d891c-1a52-4300-b19a-51a3594bdccf\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.184440 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/584d891c-1a52-4300-b19a-51a3594bdccf-config-data\") pod \"nova-scheduler-0\" (UID: \"584d891c-1a52-4300-b19a-51a3594bdccf\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.184592 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg697\" (UniqueName: \"kubernetes.io/projected/584d891c-1a52-4300-b19a-51a3594bdccf-kube-api-access-pg697\") pod \"nova-scheduler-0\" (UID: \"584d891c-1a52-4300-b19a-51a3594bdccf\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.286203 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/584d891c-1a52-4300-b19a-51a3594bdccf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"584d891c-1a52-4300-b19a-51a3594bdccf\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.286332 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/584d891c-1a52-4300-b19a-51a3594bdccf-config-data\") pod \"nova-scheduler-0\" (UID: \"584d891c-1a52-4300-b19a-51a3594bdccf\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.286390 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pg697\" (UniqueName: \"kubernetes.io/projected/584d891c-1a52-4300-b19a-51a3594bdccf-kube-api-access-pg697\") pod \"nova-scheduler-0\" (UID: \"584d891c-1a52-4300-b19a-51a3594bdccf\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.293093 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/584d891c-1a52-4300-b19a-51a3594bdccf-combined-ca-bundle\") pod \"nova-scheduler-0\" (UID: \"584d891c-1a52-4300-b19a-51a3594bdccf\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.293389 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/584d891c-1a52-4300-b19a-51a3594bdccf-config-data\") pod \"nova-scheduler-0\" (UID: \"584d891c-1a52-4300-b19a-51a3594bdccf\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.308572 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg697\" (UniqueName: \"kubernetes.io/projected/584d891c-1a52-4300-b19a-51a3594bdccf-kube-api-access-pg697\") pod \"nova-scheduler-0\" (UID: \"584d891c-1a52-4300-b19a-51a3594bdccf\") " pod="openstack/nova-scheduler-0" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.324856 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-scheduler-0" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.378384 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c47edca-0f68-4cd2-ac44-b14fa99200bb" path="/var/lib/kubelet/pods/7c47edca-0f68-4cd2-ac44-b14fa99200bb/volumes" Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.733626 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-scheduler-0"] Mar 19 10:45:58 crc kubenswrapper[4765]: W0319 10:45:58.737068 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod584d891c_1a52_4300_b19a_51a3594bdccf.slice/crio-68ace1c7ef83dae90594d9ef2bea33351b5915e08b4962b45960fa06ac3ce086 WatchSource:0}: Error finding container 68ace1c7ef83dae90594d9ef2bea33351b5915e08b4962b45960fa06ac3ce086: Status 404 returned error can't find the container with id 68ace1c7ef83dae90594d9ef2bea33351b5915e08b4962b45960fa06ac3ce086 Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.966252 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"584d891c-1a52-4300-b19a-51a3594bdccf","Type":"ContainerStarted","Data":"0f9ffd9efce21c6653adba03020d8cf861fbb2808589bc5dec4049ab8ec69849"} Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.966488 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-scheduler-0" event={"ID":"584d891c-1a52-4300-b19a-51a3594bdccf","Type":"ContainerStarted","Data":"68ace1c7ef83dae90594d9ef2bea33351b5915e08b4962b45960fa06ac3ce086"} Mar 19 10:45:58 crc kubenswrapper[4765]: I0319 10:45:58.988589 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-scheduler-0" podStartSLOduration=1.9885718890000001 podStartE2EDuration="1.988571889s" podCreationTimestamp="2026-03-19 10:45:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-03-19 10:45:58.97904091 +0000 UTC m=+1457.327986452" watchObservedRunningTime="2026-03-19 10:45:58.988571889 +0000 UTC m=+1457.337517431" Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.747505 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.923396 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-combined-ca-bundle\") pod \"09992c98-fab5-441f-b4a5-abc94d180a52\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.923500 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-config-data\") pod \"09992c98-fab5-441f-b4a5-abc94d180a52\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.923626 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09992c98-fab5-441f-b4a5-abc94d180a52-logs\") pod \"09992c98-fab5-441f-b4a5-abc94d180a52\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.923688 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ntbq5\" (UniqueName: \"kubernetes.io/projected/09992c98-fab5-441f-b4a5-abc94d180a52-kube-api-access-ntbq5\") pod \"09992c98-fab5-441f-b4a5-abc94d180a52\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.923767 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-nova-metadata-tls-certs\") pod \"09992c98-fab5-441f-b4a5-abc94d180a52\" (UID: \"09992c98-fab5-441f-b4a5-abc94d180a52\") " Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.924175 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/09992c98-fab5-441f-b4a5-abc94d180a52-logs" (OuterVolumeSpecName: "logs") pod "09992c98-fab5-441f-b4a5-abc94d180a52" (UID: "09992c98-fab5-441f-b4a5-abc94d180a52"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.924653 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/09992c98-fab5-441f-b4a5-abc94d180a52-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.930127 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09992c98-fab5-441f-b4a5-abc94d180a52-kube-api-access-ntbq5" (OuterVolumeSpecName: "kube-api-access-ntbq5") pod "09992c98-fab5-441f-b4a5-abc94d180a52" (UID: "09992c98-fab5-441f-b4a5-abc94d180a52"). InnerVolumeSpecName "kube-api-access-ntbq5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.955534 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "09992c98-fab5-441f-b4a5-abc94d180a52" (UID: "09992c98-fab5-441f-b4a5-abc94d180a52"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.976051 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-config-data" (OuterVolumeSpecName: "config-data") pod "09992c98-fab5-441f-b4a5-abc94d180a52" (UID: "09992c98-fab5-441f-b4a5-abc94d180a52"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.979623 4765 generic.go:334] "Generic (PLEG): container finished" podID="09992c98-fab5-441f-b4a5-abc94d180a52" containerID="a9df19d737101f6b88e6f349e59a5da9d4cb7a6810b1b2754333dfa4b450d85e" exitCode=0 Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.979697 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.979732 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09992c98-fab5-441f-b4a5-abc94d180a52","Type":"ContainerDied","Data":"a9df19d737101f6b88e6f349e59a5da9d4cb7a6810b1b2754333dfa4b450d85e"} Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.979779 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"09992c98-fab5-441f-b4a5-abc94d180a52","Type":"ContainerDied","Data":"949c31618f239689481e7cdfa83fe84d09518dd5a8d109ab2480cade682ccbe2"} Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.979804 4765 scope.go:117] "RemoveContainer" containerID="a9df19d737101f6b88e6f349e59a5da9d4cb7a6810b1b2754333dfa4b450d85e" Mar 19 10:45:59 crc kubenswrapper[4765]: I0319 10:45:59.985102 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-nova-metadata-tls-certs" (OuterVolumeSpecName: "nova-metadata-tls-certs") pod 
"09992c98-fab5-441f-b4a5-abc94d180a52" (UID: "09992c98-fab5-441f-b4a5-abc94d180a52"). InnerVolumeSpecName "nova-metadata-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.003332 4765 scope.go:117] "RemoveContainer" containerID="d99103fdab060d01459a8878ba8bcd33ce6202cccc3d69e13624c5e395469855" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.026162 4765 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-nova-metadata-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.026198 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.026218 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/09992c98-fab5-441f-b4a5-abc94d180a52-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.026231 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ntbq5\" (UniqueName: \"kubernetes.io/projected/09992c98-fab5-441f-b4a5-abc94d180a52-kube-api-access-ntbq5\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.044458 4765 scope.go:117] "RemoveContainer" containerID="a9df19d737101f6b88e6f349e59a5da9d4cb7a6810b1b2754333dfa4b450d85e" Mar 19 10:46:00 crc kubenswrapper[4765]: E0319 10:46:00.045017 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a9df19d737101f6b88e6f349e59a5da9d4cb7a6810b1b2754333dfa4b450d85e\": container with ID starting with a9df19d737101f6b88e6f349e59a5da9d4cb7a6810b1b2754333dfa4b450d85e 
not found: ID does not exist" containerID="a9df19d737101f6b88e6f349e59a5da9d4cb7a6810b1b2754333dfa4b450d85e" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.045075 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a9df19d737101f6b88e6f349e59a5da9d4cb7a6810b1b2754333dfa4b450d85e"} err="failed to get container status \"a9df19d737101f6b88e6f349e59a5da9d4cb7a6810b1b2754333dfa4b450d85e\": rpc error: code = NotFound desc = could not find container \"a9df19d737101f6b88e6f349e59a5da9d4cb7a6810b1b2754333dfa4b450d85e\": container with ID starting with a9df19d737101f6b88e6f349e59a5da9d4cb7a6810b1b2754333dfa4b450d85e not found: ID does not exist" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.045111 4765 scope.go:117] "RemoveContainer" containerID="d99103fdab060d01459a8878ba8bcd33ce6202cccc3d69e13624c5e395469855" Mar 19 10:46:00 crc kubenswrapper[4765]: E0319 10:46:00.045589 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d99103fdab060d01459a8878ba8bcd33ce6202cccc3d69e13624c5e395469855\": container with ID starting with d99103fdab060d01459a8878ba8bcd33ce6202cccc3d69e13624c5e395469855 not found: ID does not exist" containerID="d99103fdab060d01459a8878ba8bcd33ce6202cccc3d69e13624c5e395469855" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.045618 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d99103fdab060d01459a8878ba8bcd33ce6202cccc3d69e13624c5e395469855"} err="failed to get container status \"d99103fdab060d01459a8878ba8bcd33ce6202cccc3d69e13624c5e395469855\": rpc error: code = NotFound desc = could not find container \"d99103fdab060d01459a8878ba8bcd33ce6202cccc3d69e13624c5e395469855\": container with ID starting with d99103fdab060d01459a8878ba8bcd33ce6202cccc3d69e13624c5e395469855 not found: ID does not exist" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 
10:46:00.133245 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565286-cjfhr"] Mar 19 10:46:00 crc kubenswrapper[4765]: E0319 10:46:00.133744 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09992c98-fab5-441f-b4a5-abc94d180a52" containerName="nova-metadata-log" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.133764 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="09992c98-fab5-441f-b4a5-abc94d180a52" containerName="nova-metadata-log" Mar 19 10:46:00 crc kubenswrapper[4765]: E0319 10:46:00.133780 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="09992c98-fab5-441f-b4a5-abc94d180a52" containerName="nova-metadata-metadata" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.133786 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="09992c98-fab5-441f-b4a5-abc94d180a52" containerName="nova-metadata-metadata" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.133996 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="09992c98-fab5-441f-b4a5-abc94d180a52" containerName="nova-metadata-metadata" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.134016 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="09992c98-fab5-441f-b4a5-abc94d180a52" containerName="nova-metadata-log" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.134885 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565286-cjfhr" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.137880 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.138156 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.139028 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.144125 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565286-cjfhr"] Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.331670 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6xkpb\" (UniqueName: \"kubernetes.io/projected/741b1716-e212-46d2-b388-41d14845f2b0-kube-api-access-6xkpb\") pod \"auto-csr-approver-29565286-cjfhr\" (UID: \"741b1716-e212-46d2-b388-41d14845f2b0\") " pod="openshift-infra/auto-csr-approver-29565286-cjfhr" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.378427 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.390757 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.401653 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-metadata-0"] Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.403353 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.406973 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-metadata-internal-svc" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.407015 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-config-data" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.412841 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.434079 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6xkpb\" (UniqueName: \"kubernetes.io/projected/741b1716-e212-46d2-b388-41d14845f2b0-kube-api-access-6xkpb\") pod \"auto-csr-approver-29565286-cjfhr\" (UID: \"741b1716-e212-46d2-b388-41d14845f2b0\") " pod="openshift-infra/auto-csr-approver-29565286-cjfhr" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.451205 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6xkpb\" (UniqueName: \"kubernetes.io/projected/741b1716-e212-46d2-b388-41d14845f2b0-kube-api-access-6xkpb\") pod \"auto-csr-approver-29565286-cjfhr\" (UID: \"741b1716-e212-46d2-b388-41d14845f2b0\") " pod="openshift-infra/auto-csr-approver-29565286-cjfhr" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.459057 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565286-cjfhr" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.536283 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5603f135-db39-4e98-b372-6ec55cbc3351-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5603f135-db39-4e98-b372-6ec55cbc3351\") " pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.536340 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5603f135-db39-4e98-b372-6ec55cbc3351-logs\") pod \"nova-metadata-0\" (UID: \"5603f135-db39-4e98-b372-6ec55cbc3351\") " pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.536375 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5603f135-db39-4e98-b372-6ec55cbc3351-config-data\") pod \"nova-metadata-0\" (UID: \"5603f135-db39-4e98-b372-6ec55cbc3351\") " pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.536440 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pm8qn\" (UniqueName: \"kubernetes.io/projected/5603f135-db39-4e98-b372-6ec55cbc3351-kube-api-access-pm8qn\") pod \"nova-metadata-0\" (UID: \"5603f135-db39-4e98-b372-6ec55cbc3351\") " pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.536699 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5603f135-db39-4e98-b372-6ec55cbc3351-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5603f135-db39-4e98-b372-6ec55cbc3351\") " pod="openstack/nova-metadata-0" Mar 
19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.638486 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5603f135-db39-4e98-b372-6ec55cbc3351-logs\") pod \"nova-metadata-0\" (UID: \"5603f135-db39-4e98-b372-6ec55cbc3351\") " pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.638541 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5603f135-db39-4e98-b372-6ec55cbc3351-config-data\") pod \"nova-metadata-0\" (UID: \"5603f135-db39-4e98-b372-6ec55cbc3351\") " pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.638602 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pm8qn\" (UniqueName: \"kubernetes.io/projected/5603f135-db39-4e98-b372-6ec55cbc3351-kube-api-access-pm8qn\") pod \"nova-metadata-0\" (UID: \"5603f135-db39-4e98-b372-6ec55cbc3351\") " pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.638675 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5603f135-db39-4e98-b372-6ec55cbc3351-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5603f135-db39-4e98-b372-6ec55cbc3351\") " pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.638798 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5603f135-db39-4e98-b372-6ec55cbc3351-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5603f135-db39-4e98-b372-6ec55cbc3351\") " pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.639271 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/5603f135-db39-4e98-b372-6ec55cbc3351-logs\") pod \"nova-metadata-0\" (UID: \"5603f135-db39-4e98-b372-6ec55cbc3351\") " pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.643819 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-tls-certs\" (UniqueName: \"kubernetes.io/secret/5603f135-db39-4e98-b372-6ec55cbc3351-nova-metadata-tls-certs\") pod \"nova-metadata-0\" (UID: \"5603f135-db39-4e98-b372-6ec55cbc3351\") " pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.646256 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5603f135-db39-4e98-b372-6ec55cbc3351-config-data\") pod \"nova-metadata-0\" (UID: \"5603f135-db39-4e98-b372-6ec55cbc3351\") " pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.656627 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5603f135-db39-4e98-b372-6ec55cbc3351-combined-ca-bundle\") pod \"nova-metadata-0\" (UID: \"5603f135-db39-4e98-b372-6ec55cbc3351\") " pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.658612 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pm8qn\" (UniqueName: \"kubernetes.io/projected/5603f135-db39-4e98-b372-6ec55cbc3351-kube-api-access-pm8qn\") pod \"nova-metadata-0\" (UID: \"5603f135-db39-4e98-b372-6ec55cbc3351\") " pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.719794 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-metadata-0" Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.903485 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565286-cjfhr"] Mar 19 10:46:00 crc kubenswrapper[4765]: W0319 10:46:00.907928 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod741b1716_e212_46d2_b388_41d14845f2b0.slice/crio-ca578bf7322aa9a0022193541f8eb220cae42efad02f41ec53be66b8e3153d28 WatchSource:0}: Error finding container ca578bf7322aa9a0022193541f8eb220cae42efad02f41ec53be66b8e3153d28: Status 404 returned error can't find the container with id ca578bf7322aa9a0022193541f8eb220cae42efad02f41ec53be66b8e3153d28 Mar 19 10:46:00 crc kubenswrapper[4765]: I0319 10:46:00.990979 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565286-cjfhr" event={"ID":"741b1716-e212-46d2-b388-41d14845f2b0","Type":"ContainerStarted","Data":"ca578bf7322aa9a0022193541f8eb220cae42efad02f41ec53be66b8e3153d28"} Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.157512 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-metadata-0"] Mar 19 10:46:01 crc kubenswrapper[4765]: W0319 10:46:01.162090 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5603f135_db39_4e98_b372_6ec55cbc3351.slice/crio-672717d934c156af5347a7d8efc1e05790e02192aa81950327747357e02fa901 WatchSource:0}: Error finding container 672717d934c156af5347a7d8efc1e05790e02192aa81950327747357e02fa901: Status 404 returned error can't find the container with id 672717d934c156af5347a7d8efc1e05790e02192aa81950327747357e02fa901 Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.656287 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.656874 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.701199 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.868165 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-combined-ca-bundle\") pod \"483ee338-f847-4a1f-b644-aaf67fcd8fee\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.868339 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/483ee338-f847-4a1f-b644-aaf67fcd8fee-logs\") pod \"483ee338-f847-4a1f-b644-aaf67fcd8fee\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.868361 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-config-data\") pod \"483ee338-f847-4a1f-b644-aaf67fcd8fee\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.868430 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: 
\"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-public-tls-certs\") pod \"483ee338-f847-4a1f-b644-aaf67fcd8fee\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.868506 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bvhhz\" (UniqueName: \"kubernetes.io/projected/483ee338-f847-4a1f-b644-aaf67fcd8fee-kube-api-access-bvhhz\") pod \"483ee338-f847-4a1f-b644-aaf67fcd8fee\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.868530 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-internal-tls-certs\") pod \"483ee338-f847-4a1f-b644-aaf67fcd8fee\" (UID: \"483ee338-f847-4a1f-b644-aaf67fcd8fee\") " Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.868779 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/483ee338-f847-4a1f-b644-aaf67fcd8fee-logs" (OuterVolumeSpecName: "logs") pod "483ee338-f847-4a1f-b644-aaf67fcd8fee" (UID: "483ee338-f847-4a1f-b644-aaf67fcd8fee"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.869314 4765 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/483ee338-f847-4a1f-b644-aaf67fcd8fee-logs\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.874184 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/483ee338-f847-4a1f-b644-aaf67fcd8fee-kube-api-access-bvhhz" (OuterVolumeSpecName: "kube-api-access-bvhhz") pod "483ee338-f847-4a1f-b644-aaf67fcd8fee" (UID: "483ee338-f847-4a1f-b644-aaf67fcd8fee"). InnerVolumeSpecName "kube-api-access-bvhhz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.902071 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-config-data" (OuterVolumeSpecName: "config-data") pod "483ee338-f847-4a1f-b644-aaf67fcd8fee" (UID: "483ee338-f847-4a1f-b644-aaf67fcd8fee"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.906540 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "483ee338-f847-4a1f-b644-aaf67fcd8fee" (UID: "483ee338-f847-4a1f-b644-aaf67fcd8fee"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.922593 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "483ee338-f847-4a1f-b644-aaf67fcd8fee" (UID: "483ee338-f847-4a1f-b644-aaf67fcd8fee"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.924822 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "483ee338-f847-4a1f-b644-aaf67fcd8fee" (UID: "483ee338-f847-4a1f-b644-aaf67fcd8fee"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.971093 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.971160 4765 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-public-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.971181 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bvhhz\" (UniqueName: \"kubernetes.io/projected/483ee338-f847-4a1f-b644-aaf67fcd8fee-kube-api-access-bvhhz\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.971216 4765 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:01 crc kubenswrapper[4765]: I0319 10:46:01.971227 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/483ee338-f847-4a1f-b644-aaf67fcd8fee-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.003594 4765 generic.go:334] "Generic (PLEG): container finished" podID="483ee338-f847-4a1f-b644-aaf67fcd8fee" containerID="e749b1ce2540cbf495c6ceaccd8d239894b109cc8c5b4a6d53fb78ef35145f9c" exitCode=0 Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.003641 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.003670 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"483ee338-f847-4a1f-b644-aaf67fcd8fee","Type":"ContainerDied","Data":"e749b1ce2540cbf495c6ceaccd8d239894b109cc8c5b4a6d53fb78ef35145f9c"} Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.003700 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"483ee338-f847-4a1f-b644-aaf67fcd8fee","Type":"ContainerDied","Data":"520305eaafe4bae2782db5b5065ab350de090847f97aa66eb848f494ddb6ed45"} Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.003718 4765 scope.go:117] "RemoveContainer" containerID="e749b1ce2540cbf495c6ceaccd8d239894b109cc8c5b4a6d53fb78ef35145f9c" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.010861 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5603f135-db39-4e98-b372-6ec55cbc3351","Type":"ContainerStarted","Data":"865b2c8c0335524fbc4504e8dc9b206f90787568263c4860fff32006f9fbfcdb"} Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.010913 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5603f135-db39-4e98-b372-6ec55cbc3351","Type":"ContainerStarted","Data":"fab49bd128ed84d8a075287ed553ef0dcc3f04179a277a3f53bd8914c0990dd4"} Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.010925 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-metadata-0" event={"ID":"5603f135-db39-4e98-b372-6ec55cbc3351","Type":"ContainerStarted","Data":"672717d934c156af5347a7d8efc1e05790e02192aa81950327747357e02fa901"} Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.024194 4765 scope.go:117] "RemoveContainer" containerID="c9fe9d7aa7ac53d67e5acba36d2f2554e1e97d662eace2dee6870b0037858fa7" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.045682 4765 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-metadata-0" podStartSLOduration=2.04564675 podStartE2EDuration="2.04564675s" podCreationTimestamp="2026-03-19 10:46:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:46:02.027440895 +0000 UTC m=+1460.376386437" watchObservedRunningTime="2026-03-19 10:46:02.04564675 +0000 UTC m=+1460.394592302" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.067135 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.070257 4765 scope.go:117] "RemoveContainer" containerID="e749b1ce2540cbf495c6ceaccd8d239894b109cc8c5b4a6d53fb78ef35145f9c" Mar 19 10:46:02 crc kubenswrapper[4765]: E0319 10:46:02.074583 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e749b1ce2540cbf495c6ceaccd8d239894b109cc8c5b4a6d53fb78ef35145f9c\": container with ID starting with e749b1ce2540cbf495c6ceaccd8d239894b109cc8c5b4a6d53fb78ef35145f9c not found: ID does not exist" containerID="e749b1ce2540cbf495c6ceaccd8d239894b109cc8c5b4a6d53fb78ef35145f9c" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.074625 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e749b1ce2540cbf495c6ceaccd8d239894b109cc8c5b4a6d53fb78ef35145f9c"} err="failed to get container status \"e749b1ce2540cbf495c6ceaccd8d239894b109cc8c5b4a6d53fb78ef35145f9c\": rpc error: code = NotFound desc = could not find container \"e749b1ce2540cbf495c6ceaccd8d239894b109cc8c5b4a6d53fb78ef35145f9c\": container with ID starting with e749b1ce2540cbf495c6ceaccd8d239894b109cc8c5b4a6d53fb78ef35145f9c not found: ID does not exist" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.074651 4765 scope.go:117] "RemoveContainer" 
containerID="c9fe9d7aa7ac53d67e5acba36d2f2554e1e97d662eace2dee6870b0037858fa7" Mar 19 10:46:02 crc kubenswrapper[4765]: E0319 10:46:02.078731 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c9fe9d7aa7ac53d67e5acba36d2f2554e1e97d662eace2dee6870b0037858fa7\": container with ID starting with c9fe9d7aa7ac53d67e5acba36d2f2554e1e97d662eace2dee6870b0037858fa7 not found: ID does not exist" containerID="c9fe9d7aa7ac53d67e5acba36d2f2554e1e97d662eace2dee6870b0037858fa7" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.078773 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c9fe9d7aa7ac53d67e5acba36d2f2554e1e97d662eace2dee6870b0037858fa7"} err="failed to get container status \"c9fe9d7aa7ac53d67e5acba36d2f2554e1e97d662eace2dee6870b0037858fa7\": rpc error: code = NotFound desc = could not find container \"c9fe9d7aa7ac53d67e5acba36d2f2554e1e97d662eace2dee6870b0037858fa7\": container with ID starting with c9fe9d7aa7ac53d67e5acba36d2f2554e1e97d662eace2dee6870b0037858fa7 not found: ID does not exist" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.087054 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.095708 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-api-0"] Mar 19 10:46:02 crc kubenswrapper[4765]: E0319 10:46:02.096103 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="483ee338-f847-4a1f-b644-aaf67fcd8fee" containerName="nova-api-log" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.096117 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="483ee338-f847-4a1f-b644-aaf67fcd8fee" containerName="nova-api-log" Mar 19 10:46:02 crc kubenswrapper[4765]: E0319 10:46:02.096154 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="483ee338-f847-4a1f-b644-aaf67fcd8fee" 
containerName="nova-api-api" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.096161 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="483ee338-f847-4a1f-b644-aaf67fcd8fee" containerName="nova-api-api" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.096330 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="483ee338-f847-4a1f-b644-aaf67fcd8fee" containerName="nova-api-log" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.096358 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="483ee338-f847-4a1f-b644-aaf67fcd8fee" containerName="nova-api-api" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.097246 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.100214 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-api-config-data" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.100236 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-internal-svc" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.100269 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-nova-public-svc" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.126566 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.276783 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b32fc33-5dc9-44b4-9313-1ad458fe9473-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.277203 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/4b32fc33-5dc9-44b4-9313-1ad458fe9473-public-tls-certs\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.277303 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b32fc33-5dc9-44b4-9313-1ad458fe9473-logs\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.277404 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b32fc33-5dc9-44b4-9313-1ad458fe9473-config-data\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.277591 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b32fc33-5dc9-44b4-9313-1ad458fe9473-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.277692 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7qd5\" (UniqueName: \"kubernetes.io/projected/4b32fc33-5dc9-44b4-9313-1ad458fe9473-kube-api-access-t7qd5\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.380615 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b32fc33-5dc9-44b4-9313-1ad458fe9473-public-tls-certs\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " 
pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.380778 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b32fc33-5dc9-44b4-9313-1ad458fe9473-logs\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.380832 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b32fc33-5dc9-44b4-9313-1ad458fe9473-config-data\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.380899 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b32fc33-5dc9-44b4-9313-1ad458fe9473-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.380932 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t7qd5\" (UniqueName: \"kubernetes.io/projected/4b32fc33-5dc9-44b4-9313-1ad458fe9473-kube-api-access-t7qd5\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.381015 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b32fc33-5dc9-44b4-9313-1ad458fe9473-internal-tls-certs\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.380915 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09992c98-fab5-441f-b4a5-abc94d180a52" 
path="/var/lib/kubelet/pods/09992c98-fab5-441f-b4a5-abc94d180a52/volumes" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.383007 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="483ee338-f847-4a1f-b644-aaf67fcd8fee" path="/var/lib/kubelet/pods/483ee338-f847-4a1f-b644-aaf67fcd8fee/volumes" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.385042 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/4b32fc33-5dc9-44b4-9313-1ad458fe9473-logs\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.387230 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/4b32fc33-5dc9-44b4-9313-1ad458fe9473-combined-ca-bundle\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.387810 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/4b32fc33-5dc9-44b4-9313-1ad458fe9473-config-data\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.387939 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b32fc33-5dc9-44b4-9313-1ad458fe9473-public-tls-certs\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.389484 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/4b32fc33-5dc9-44b4-9313-1ad458fe9473-internal-tls-certs\") pod \"nova-api-0\" (UID: 
\"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.398683 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7qd5\" (UniqueName: \"kubernetes.io/projected/4b32fc33-5dc9-44b4-9313-1ad458fe9473-kube-api-access-t7qd5\") pod \"nova-api-0\" (UID: \"4b32fc33-5dc9-44b4-9313-1ad458fe9473\") " pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.428334 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-api-0" Mar 19 10:46:02 crc kubenswrapper[4765]: I0319 10:46:02.874662 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-api-0"] Mar 19 10:46:03 crc kubenswrapper[4765]: I0319 10:46:03.022012 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4b32fc33-5dc9-44b4-9313-1ad458fe9473","Type":"ContainerStarted","Data":"37077179a5728a7e3b7ffe18e914f81857616cbe82b4f21fc371477558f9bbad"} Mar 19 10:46:03 crc kubenswrapper[4765]: I0319 10:46:03.023471 4765 generic.go:334] "Generic (PLEG): container finished" podID="741b1716-e212-46d2-b388-41d14845f2b0" containerID="0daa0cf7d801aba08b6255a5d3f9d5ad1fb6f6a7bba78b1fc71d24cf3afa469e" exitCode=0 Mar 19 10:46:03 crc kubenswrapper[4765]: I0319 10:46:03.023535 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565286-cjfhr" event={"ID":"741b1716-e212-46d2-b388-41d14845f2b0","Type":"ContainerDied","Data":"0daa0cf7d801aba08b6255a5d3f9d5ad1fb6f6a7bba78b1fc71d24cf3afa469e"} Mar 19 10:46:03 crc kubenswrapper[4765]: I0319 10:46:03.325245 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-scheduler-0" Mar 19 10:46:04 crc kubenswrapper[4765]: I0319 10:46:04.035111 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" 
event={"ID":"4b32fc33-5dc9-44b4-9313-1ad458fe9473","Type":"ContainerStarted","Data":"5d1f2a9850fb075e002efc755c5d57cb06fd722418693758ea5a94df75f5b48e"} Mar 19 10:46:04 crc kubenswrapper[4765]: I0319 10:46:04.035455 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-api-0" event={"ID":"4b32fc33-5dc9-44b4-9313-1ad458fe9473","Type":"ContainerStarted","Data":"594a8ce5bd968c2155e5dd08e1f3452c17207ef83140ea182f795aec780571ce"} Mar 19 10:46:04 crc kubenswrapper[4765]: I0319 10:46:04.056129 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-api-0" podStartSLOduration=2.056110395 podStartE2EDuration="2.056110395s" podCreationTimestamp="2026-03-19 10:46:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:46:04.054259915 +0000 UTC m=+1462.403205477" watchObservedRunningTime="2026-03-19 10:46:04.056110395 +0000 UTC m=+1462.405055937" Mar 19 10:46:04 crc kubenswrapper[4765]: I0319 10:46:04.371415 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565286-cjfhr" Mar 19 10:46:04 crc kubenswrapper[4765]: I0319 10:46:04.519807 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6xkpb\" (UniqueName: \"kubernetes.io/projected/741b1716-e212-46d2-b388-41d14845f2b0-kube-api-access-6xkpb\") pod \"741b1716-e212-46d2-b388-41d14845f2b0\" (UID: \"741b1716-e212-46d2-b388-41d14845f2b0\") " Mar 19 10:46:04 crc kubenswrapper[4765]: I0319 10:46:04.524930 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/741b1716-e212-46d2-b388-41d14845f2b0-kube-api-access-6xkpb" (OuterVolumeSpecName: "kube-api-access-6xkpb") pod "741b1716-e212-46d2-b388-41d14845f2b0" (UID: "741b1716-e212-46d2-b388-41d14845f2b0"). InnerVolumeSpecName "kube-api-access-6xkpb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:46:04 crc kubenswrapper[4765]: I0319 10:46:04.622390 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6xkpb\" (UniqueName: \"kubernetes.io/projected/741b1716-e212-46d2-b388-41d14845f2b0-kube-api-access-6xkpb\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:05 crc kubenswrapper[4765]: I0319 10:46:05.045591 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565286-cjfhr" event={"ID":"741b1716-e212-46d2-b388-41d14845f2b0","Type":"ContainerDied","Data":"ca578bf7322aa9a0022193541f8eb220cae42efad02f41ec53be66b8e3153d28"} Mar 19 10:46:05 crc kubenswrapper[4765]: I0319 10:46:05.045652 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca578bf7322aa9a0022193541f8eb220cae42efad02f41ec53be66b8e3153d28" Mar 19 10:46:05 crc kubenswrapper[4765]: I0319 10:46:05.045618 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565286-cjfhr" Mar 19 10:46:05 crc kubenswrapper[4765]: I0319 10:46:05.441303 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565280-nf5hg"] Mar 19 10:46:05 crc kubenswrapper[4765]: I0319 10:46:05.454034 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565280-nf5hg"] Mar 19 10:46:06 crc kubenswrapper[4765]: I0319 10:46:06.236513 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-prwdk"] Mar 19 10:46:06 crc kubenswrapper[4765]: E0319 10:46:06.237068 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="741b1716-e212-46d2-b388-41d14845f2b0" containerName="oc" Mar 19 10:46:06 crc kubenswrapper[4765]: I0319 10:46:06.237083 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="741b1716-e212-46d2-b388-41d14845f2b0" containerName="oc" Mar 19 10:46:06 crc 
kubenswrapper[4765]: I0319 10:46:06.237331 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="741b1716-e212-46d2-b388-41d14845f2b0" containerName="oc" Mar 19 10:46:06 crc kubenswrapper[4765]: I0319 10:46:06.239013 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-prwdk" Mar 19 10:46:06 crc kubenswrapper[4765]: I0319 10:46:06.256736 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prwdk"] Mar 19 10:46:06 crc kubenswrapper[4765]: I0319 10:46:06.367470 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="823c3d08-1e08-4fba-b6c7-8591036f93bc" path="/var/lib/kubelet/pods/823c3d08-1e08-4fba-b6c7-8591036f93bc/volumes" Mar 19 10:46:06 crc kubenswrapper[4765]: I0319 10:46:06.370987 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99b05c7-b9bf-4814-9d29-b5d9076a98a8-catalog-content\") pod \"redhat-operators-prwdk\" (UID: \"b99b05c7-b9bf-4814-9d29-b5d9076a98a8\") " pod="openshift-marketplace/redhat-operators-prwdk" Mar 19 10:46:06 crc kubenswrapper[4765]: I0319 10:46:06.371082 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h2xvh\" (UniqueName: \"kubernetes.io/projected/b99b05c7-b9bf-4814-9d29-b5d9076a98a8-kube-api-access-h2xvh\") pod \"redhat-operators-prwdk\" (UID: \"b99b05c7-b9bf-4814-9d29-b5d9076a98a8\") " pod="openshift-marketplace/redhat-operators-prwdk" Mar 19 10:46:06 crc kubenswrapper[4765]: I0319 10:46:06.371188 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99b05c7-b9bf-4814-9d29-b5d9076a98a8-utilities\") pod \"redhat-operators-prwdk\" (UID: \"b99b05c7-b9bf-4814-9d29-b5d9076a98a8\") " 
pod="openshift-marketplace/redhat-operators-prwdk" Mar 19 10:46:06 crc kubenswrapper[4765]: I0319 10:46:06.473347 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99b05c7-b9bf-4814-9d29-b5d9076a98a8-utilities\") pod \"redhat-operators-prwdk\" (UID: \"b99b05c7-b9bf-4814-9d29-b5d9076a98a8\") " pod="openshift-marketplace/redhat-operators-prwdk" Mar 19 10:46:06 crc kubenswrapper[4765]: I0319 10:46:06.473924 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99b05c7-b9bf-4814-9d29-b5d9076a98a8-catalog-content\") pod \"redhat-operators-prwdk\" (UID: \"b99b05c7-b9bf-4814-9d29-b5d9076a98a8\") " pod="openshift-marketplace/redhat-operators-prwdk" Mar 19 10:46:06 crc kubenswrapper[4765]: I0319 10:46:06.473983 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b99b05c7-b9bf-4814-9d29-b5d9076a98a8-utilities\") pod \"redhat-operators-prwdk\" (UID: \"b99b05c7-b9bf-4814-9d29-b5d9076a98a8\") " pod="openshift-marketplace/redhat-operators-prwdk" Mar 19 10:46:06 crc kubenswrapper[4765]: I0319 10:46:06.474181 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h2xvh\" (UniqueName: \"kubernetes.io/projected/b99b05c7-b9bf-4814-9d29-b5d9076a98a8-kube-api-access-h2xvh\") pod \"redhat-operators-prwdk\" (UID: \"b99b05c7-b9bf-4814-9d29-b5d9076a98a8\") " pod="openshift-marketplace/redhat-operators-prwdk" Mar 19 10:46:06 crc kubenswrapper[4765]: I0319 10:46:06.474453 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b99b05c7-b9bf-4814-9d29-b5d9076a98a8-catalog-content\") pod \"redhat-operators-prwdk\" (UID: \"b99b05c7-b9bf-4814-9d29-b5d9076a98a8\") " pod="openshift-marketplace/redhat-operators-prwdk" Mar 19 10:46:06 crc 
kubenswrapper[4765]: I0319 10:46:06.494987 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h2xvh\" (UniqueName: \"kubernetes.io/projected/b99b05c7-b9bf-4814-9d29-b5d9076a98a8-kube-api-access-h2xvh\") pod \"redhat-operators-prwdk\" (UID: \"b99b05c7-b9bf-4814-9d29-b5d9076a98a8\") " pod="openshift-marketplace/redhat-operators-prwdk" Mar 19 10:46:06 crc kubenswrapper[4765]: I0319 10:46:06.561925 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-prwdk" Mar 19 10:46:07 crc kubenswrapper[4765]: I0319 10:46:07.034664 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prwdk"] Mar 19 10:46:07 crc kubenswrapper[4765]: W0319 10:46:07.035714 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb99b05c7_b9bf_4814_9d29_b5d9076a98a8.slice/crio-2c3aa0dccaf3433d57fd453e1e6e90abc09ff6bba4b9dfd5f4f0b0b61ecd951e WatchSource:0}: Error finding container 2c3aa0dccaf3433d57fd453e1e6e90abc09ff6bba4b9dfd5f4f0b0b61ecd951e: Status 404 returned error can't find the container with id 2c3aa0dccaf3433d57fd453e1e6e90abc09ff6bba4b9dfd5f4f0b0b61ecd951e Mar 19 10:46:07 crc kubenswrapper[4765]: I0319 10:46:07.061853 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prwdk" event={"ID":"b99b05c7-b9bf-4814-9d29-b5d9076a98a8","Type":"ContainerStarted","Data":"2c3aa0dccaf3433d57fd453e1e6e90abc09ff6bba4b9dfd5f4f0b0b61ecd951e"} Mar 19 10:46:08 crc kubenswrapper[4765]: I0319 10:46:08.071376 4765 generic.go:334] "Generic (PLEG): container finished" podID="b99b05c7-b9bf-4814-9d29-b5d9076a98a8" containerID="3cfab37ce36a5e8e1a40f5c717761143f79256bb60bda687dea3972cd0804557" exitCode=0 Mar 19 10:46:08 crc kubenswrapper[4765]: I0319 10:46:08.071655 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-prwdk" event={"ID":"b99b05c7-b9bf-4814-9d29-b5d9076a98a8","Type":"ContainerDied","Data":"3cfab37ce36a5e8e1a40f5c717761143f79256bb60bda687dea3972cd0804557"} Mar 19 10:46:08 crc kubenswrapper[4765]: I0319 10:46:08.325847 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-scheduler-0" Mar 19 10:46:08 crc kubenswrapper[4765]: I0319 10:46:08.352754 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-scheduler-0" Mar 19 10:46:09 crc kubenswrapper[4765]: I0319 10:46:09.106044 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-scheduler-0" Mar 19 10:46:10 crc kubenswrapper[4765]: I0319 10:46:10.720433 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 10:46:10 crc kubenswrapper[4765]: I0319 10:46:10.720498 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-metadata-0" Mar 19 10:46:11 crc kubenswrapper[4765]: I0319 10:46:11.740100 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5603f135-db39-4e98-b372-6ec55cbc3351" containerName="nova-metadata-log" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:46:11 crc kubenswrapper[4765]: I0319 10:46:11.740100 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-metadata-0" podUID="5603f135-db39-4e98-b372-6ec55cbc3351" containerName="nova-metadata-metadata" probeResult="failure" output="Get \"https://10.217.0.211:8775/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:46:12 crc kubenswrapper[4765]: I0319 10:46:12.428831 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 10:46:12 crc 
kubenswrapper[4765]: I0319 10:46:12.429997 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack/nova-api-0" Mar 19 10:46:13 crc kubenswrapper[4765]: I0319 10:46:13.089404 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/ceilometer-0" Mar 19 10:46:13 crc kubenswrapper[4765]: I0319 10:46:13.443245 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4b32fc33-5dc9-44b4-9313-1ad458fe9473" containerName="nova-api-api" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:46:13 crc kubenswrapper[4765]: I0319 10:46:13.443357 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openstack/nova-api-0" podUID="4b32fc33-5dc9-44b4-9313-1ad458fe9473" containerName="nova-api-log" probeResult="failure" output="Get \"https://10.217.0.212:8774/\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Mar 19 10:46:18 crc kubenswrapper[4765]: I0319 10:46:18.184133 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prwdk" event={"ID":"b99b05c7-b9bf-4814-9d29-b5d9076a98a8","Type":"ContainerStarted","Data":"9e2561a7103a2bf5a61f1629df130a7c453789bc1bb27e614f1f5a8a10de7c3a"} Mar 19 10:46:18 crc kubenswrapper[4765]: I0319 10:46:18.720361 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 10:46:18 crc kubenswrapper[4765]: I0319 10:46:18.720718 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-metadata-0" Mar 19 10:46:20 crc kubenswrapper[4765]: I0319 10:46:20.429347 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/nova-api-0" Mar 19 10:46:20 crc kubenswrapper[4765]: I0319 10:46:20.429405 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openstack/nova-api-0" Mar 19 10:46:20 crc kubenswrapper[4765]: I0319 10:46:20.725701 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 10:46:20 crc kubenswrapper[4765]: I0319 10:46:20.727513 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-metadata-0" Mar 19 10:46:20 crc kubenswrapper[4765]: I0319 10:46:20.730585 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 10:46:21 crc kubenswrapper[4765]: I0319 10:46:21.212786 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-metadata-0" Mar 19 10:46:22 crc kubenswrapper[4765]: I0319 10:46:22.221816 4765 generic.go:334] "Generic (PLEG): container finished" podID="b99b05c7-b9bf-4814-9d29-b5d9076a98a8" containerID="9e2561a7103a2bf5a61f1629df130a7c453789bc1bb27e614f1f5a8a10de7c3a" exitCode=0 Mar 19 10:46:22 crc kubenswrapper[4765]: I0319 10:46:22.223525 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prwdk" event={"ID":"b99b05c7-b9bf-4814-9d29-b5d9076a98a8","Type":"ContainerDied","Data":"9e2561a7103a2bf5a61f1629df130a7c453789bc1bb27e614f1f5a8a10de7c3a"} Mar 19 10:46:22 crc kubenswrapper[4765]: I0319 10:46:22.435983 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 10:46:22 crc kubenswrapper[4765]: I0319 10:46:22.441870 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 10:46:22 crc kubenswrapper[4765]: I0319 10:46:22.491385 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack/nova-api-0" Mar 19 10:46:23 crc kubenswrapper[4765]: I0319 10:46:23.237684 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/nova-api-0" Mar 19 10:46:24 crc kubenswrapper[4765]: I0319 10:46:24.240115 
4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-prwdk" event={"ID":"b99b05c7-b9bf-4814-9d29-b5d9076a98a8","Type":"ContainerStarted","Data":"f5ba38515c3b82b703fb3cbd7cc99ec7c75f09922052906266e4c97b2f542342"} Mar 19 10:46:24 crc kubenswrapper[4765]: I0319 10:46:24.267226 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-prwdk" podStartSLOduration=2.94885787 podStartE2EDuration="18.267204918s" podCreationTimestamp="2026-03-19 10:46:06 +0000 UTC" firstStartedPulling="2026-03-19 10:46:08.073537041 +0000 UTC m=+1466.422482573" lastFinishedPulling="2026-03-19 10:46:23.391884079 +0000 UTC m=+1481.740829621" observedRunningTime="2026-03-19 10:46:24.259721114 +0000 UTC m=+1482.608666666" watchObservedRunningTime="2026-03-19 10:46:24.267204918 +0000 UTC m=+1482.616150460" Mar 19 10:46:26 crc kubenswrapper[4765]: I0319 10:46:26.562543 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-prwdk" Mar 19 10:46:26 crc kubenswrapper[4765]: I0319 10:46:26.562862 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-prwdk" Mar 19 10:46:27 crc kubenswrapper[4765]: I0319 10:46:27.618776 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-prwdk" podUID="b99b05c7-b9bf-4814-9d29-b5d9076a98a8" containerName="registry-server" probeResult="failure" output=< Mar 19 10:46:27 crc kubenswrapper[4765]: timeout: failed to connect service ":50051" within 1s Mar 19 10:46:27 crc kubenswrapper[4765]: > Mar 19 10:46:30 crc kubenswrapper[4765]: I0319 10:46:30.811826 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 10:46:31 crc kubenswrapper[4765]: I0319 10:46:31.656709 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:46:31 crc kubenswrapper[4765]: I0319 10:46:31.657015 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:46:31 crc kubenswrapper[4765]: I0319 10:46:31.657135 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:46:31 crc kubenswrapper[4765]: I0319 10:46:31.657917 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7fbbabc77237677f702271306a25be40ef78a15b44ac1218092fa412c82ce0c1"} pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:46:31 crc kubenswrapper[4765]: I0319 10:46:31.658100 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://7fbbabc77237677f702271306a25be40ef78a15b44ac1218092fa412c82ce0c1" gracePeriod=600 Mar 19 10:46:31 crc kubenswrapper[4765]: I0319 10:46:31.931182 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 10:46:32 crc kubenswrapper[4765]: I0319 10:46:32.315011 4765 generic.go:334] "Generic (PLEG): container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" 
containerID="7fbbabc77237677f702271306a25be40ef78a15b44ac1218092fa412c82ce0c1" exitCode=0 Mar 19 10:46:32 crc kubenswrapper[4765]: I0319 10:46:32.315071 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"7fbbabc77237677f702271306a25be40ef78a15b44ac1218092fa412c82ce0c1"} Mar 19 10:46:32 crc kubenswrapper[4765]: I0319 10:46:32.315137 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984"} Mar 19 10:46:32 crc kubenswrapper[4765]: I0319 10:46:32.315288 4765 scope.go:117] "RemoveContainer" containerID="f15a48ff831f92a999a822adb51bf1e4ef1ab9b4cad221adcbd0787b32c65b85" Mar 19 10:46:35 crc kubenswrapper[4765]: I0319 10:46:35.257356 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-server-0" podUID="2190a046-0d52-49c7-b2fd-aa113c2f3f99" containerName="rabbitmq" containerID="cri-o://da10f80dff1f73e3d83ca18e932736d7d582457c3a6c02937b0e90ea18e0cec9" gracePeriod=604796 Mar 19 10:46:36 crc kubenswrapper[4765]: I0319 10:46:36.012607 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/rabbitmq-cell1-server-0" podUID="ccdb0a31-8b87-4024-848f-efebcf46e604" containerName="rabbitmq" containerID="cri-o://ac4ca5b206e9277cd39173f5dfd422276689843a7fc0ce7c0e0c3a65f43ef637" gracePeriod=604796 Mar 19 10:46:36 crc kubenswrapper[4765]: I0319 10:46:36.625180 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-prwdk" Mar 19 10:46:36 crc kubenswrapper[4765]: I0319 10:46:36.677104 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-operators-prwdk" Mar 19 10:46:37 crc kubenswrapper[4765]: I0319 10:46:37.285773 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-prwdk"] Mar 19 10:46:37 crc kubenswrapper[4765]: I0319 10:46:37.452802 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d2rb7"] Mar 19 10:46:37 crc kubenswrapper[4765]: I0319 10:46:37.453107 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-d2rb7" podUID="bb857f38-edd5-4cd5-9004-3f1737f6aec8" containerName="registry-server" containerID="cri-o://8991e990eaaed4087cd59e908a50204b83e97382c62e738a03cfff181042688a" gracePeriod=2 Mar 19 10:46:37 crc kubenswrapper[4765]: I0319 10:46:37.958854 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.107705 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb857f38-edd5-4cd5-9004-3f1737f6aec8-utilities\") pod \"bb857f38-edd5-4cd5-9004-3f1737f6aec8\" (UID: \"bb857f38-edd5-4cd5-9004-3f1737f6aec8\") " Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.107847 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gl84n\" (UniqueName: \"kubernetes.io/projected/bb857f38-edd5-4cd5-9004-3f1737f6aec8-kube-api-access-gl84n\") pod \"bb857f38-edd5-4cd5-9004-3f1737f6aec8\" (UID: \"bb857f38-edd5-4cd5-9004-3f1737f6aec8\") " Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.108030 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb857f38-edd5-4cd5-9004-3f1737f6aec8-catalog-content\") pod \"bb857f38-edd5-4cd5-9004-3f1737f6aec8\" (UID: 
\"bb857f38-edd5-4cd5-9004-3f1737f6aec8\") " Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.108919 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb857f38-edd5-4cd5-9004-3f1737f6aec8-utilities" (OuterVolumeSpecName: "utilities") pod "bb857f38-edd5-4cd5-9004-3f1737f6aec8" (UID: "bb857f38-edd5-4cd5-9004-3f1737f6aec8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.114399 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bb857f38-edd5-4cd5-9004-3f1737f6aec8-kube-api-access-gl84n" (OuterVolumeSpecName: "kube-api-access-gl84n") pod "bb857f38-edd5-4cd5-9004-3f1737f6aec8" (UID: "bb857f38-edd5-4cd5-9004-3f1737f6aec8"). InnerVolumeSpecName "kube-api-access-gl84n". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.210224 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/bb857f38-edd5-4cd5-9004-3f1737f6aec8-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.210453 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gl84n\" (UniqueName: \"kubernetes.io/projected/bb857f38-edd5-4cd5-9004-3f1737f6aec8-kube-api-access-gl84n\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.277769 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bb857f38-edd5-4cd5-9004-3f1737f6aec8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "bb857f38-edd5-4cd5-9004-3f1737f6aec8" (UID: "bb857f38-edd5-4cd5-9004-3f1737f6aec8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.312200 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/bb857f38-edd5-4cd5-9004-3f1737f6aec8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.380457 4765 generic.go:334] "Generic (PLEG): container finished" podID="bb857f38-edd5-4cd5-9004-3f1737f6aec8" containerID="8991e990eaaed4087cd59e908a50204b83e97382c62e738a03cfff181042688a" exitCode=0 Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.380541 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-d2rb7" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.380682 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2rb7" event={"ID":"bb857f38-edd5-4cd5-9004-3f1737f6aec8","Type":"ContainerDied","Data":"8991e990eaaed4087cd59e908a50204b83e97382c62e738a03cfff181042688a"} Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.380847 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-d2rb7" event={"ID":"bb857f38-edd5-4cd5-9004-3f1737f6aec8","Type":"ContainerDied","Data":"43689b0ad00a34ff1e647d1aaf15c888c472e853dc4275c834102ae1504acc1d"} Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.380951 4765 scope.go:117] "RemoveContainer" containerID="8991e990eaaed4087cd59e908a50204b83e97382c62e738a03cfff181042688a" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.406803 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-d2rb7"] Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.412667 4765 scope.go:117] "RemoveContainer" containerID="9238cbf791b7ea6b206433db3c57d0ba41a243c4eb691244cdc45b2d317ecfc3" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 
10:46:38.416104 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-d2rb7"] Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.454027 4765 scope.go:117] "RemoveContainer" containerID="b8bb7ef1d888e2b9b8b4b67924c001f657c9881109974e97093e5447c325fc44" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.482180 4765 scope.go:117] "RemoveContainer" containerID="8991e990eaaed4087cd59e908a50204b83e97382c62e738a03cfff181042688a" Mar 19 10:46:38 crc kubenswrapper[4765]: E0319 10:46:38.482564 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8991e990eaaed4087cd59e908a50204b83e97382c62e738a03cfff181042688a\": container with ID starting with 8991e990eaaed4087cd59e908a50204b83e97382c62e738a03cfff181042688a not found: ID does not exist" containerID="8991e990eaaed4087cd59e908a50204b83e97382c62e738a03cfff181042688a" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.482603 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8991e990eaaed4087cd59e908a50204b83e97382c62e738a03cfff181042688a"} err="failed to get container status \"8991e990eaaed4087cd59e908a50204b83e97382c62e738a03cfff181042688a\": rpc error: code = NotFound desc = could not find container \"8991e990eaaed4087cd59e908a50204b83e97382c62e738a03cfff181042688a\": container with ID starting with 8991e990eaaed4087cd59e908a50204b83e97382c62e738a03cfff181042688a not found: ID does not exist" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.482631 4765 scope.go:117] "RemoveContainer" containerID="9238cbf791b7ea6b206433db3c57d0ba41a243c4eb691244cdc45b2d317ecfc3" Mar 19 10:46:38 crc kubenswrapper[4765]: E0319 10:46:38.483026 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9238cbf791b7ea6b206433db3c57d0ba41a243c4eb691244cdc45b2d317ecfc3\": container with ID 
starting with 9238cbf791b7ea6b206433db3c57d0ba41a243c4eb691244cdc45b2d317ecfc3 not found: ID does not exist" containerID="9238cbf791b7ea6b206433db3c57d0ba41a243c4eb691244cdc45b2d317ecfc3" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.483150 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9238cbf791b7ea6b206433db3c57d0ba41a243c4eb691244cdc45b2d317ecfc3"} err="failed to get container status \"9238cbf791b7ea6b206433db3c57d0ba41a243c4eb691244cdc45b2d317ecfc3\": rpc error: code = NotFound desc = could not find container \"9238cbf791b7ea6b206433db3c57d0ba41a243c4eb691244cdc45b2d317ecfc3\": container with ID starting with 9238cbf791b7ea6b206433db3c57d0ba41a243c4eb691244cdc45b2d317ecfc3 not found: ID does not exist" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.483244 4765 scope.go:117] "RemoveContainer" containerID="b8bb7ef1d888e2b9b8b4b67924c001f657c9881109974e97093e5447c325fc44" Mar 19 10:46:38 crc kubenswrapper[4765]: E0319 10:46:38.483652 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b8bb7ef1d888e2b9b8b4b67924c001f657c9881109974e97093e5447c325fc44\": container with ID starting with b8bb7ef1d888e2b9b8b4b67924c001f657c9881109974e97093e5447c325fc44 not found: ID does not exist" containerID="b8bb7ef1d888e2b9b8b4b67924c001f657c9881109974e97093e5447c325fc44" Mar 19 10:46:38 crc kubenswrapper[4765]: I0319 10:46:38.483690 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b8bb7ef1d888e2b9b8b4b67924c001f657c9881109974e97093e5447c325fc44"} err="failed to get container status \"b8bb7ef1d888e2b9b8b4b67924c001f657c9881109974e97093e5447c325fc44\": rpc error: code = NotFound desc = could not find container \"b8bb7ef1d888e2b9b8b4b67924c001f657c9881109974e97093e5447c325fc44\": container with ID starting with b8bb7ef1d888e2b9b8b4b67924c001f657c9881109974e97093e5447c325fc44 not found: 
ID does not exist" Mar 19 10:46:40 crc kubenswrapper[4765]: I0319 10:46:40.368688 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bb857f38-edd5-4cd5-9004-3f1737f6aec8" path="/var/lib/kubelet/pods/bb857f38-edd5-4cd5-9004-3f1737f6aec8/volumes" Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.830016 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.977625 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-erlang-cookie\") pod \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.977912 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-server-conf\") pod \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.978022 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8np94\" (UniqueName: \"kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-kube-api-access-8np94\") pod \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.978137 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-config-data\") pod \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.978235 4765 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2190a046-0d52-49c7-b2fd-aa113c2f3f99-erlang-cookie-secret\") pod \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.978344 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2190a046-0d52-49c7-b2fd-aa113c2f3f99-pod-info\") pod \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.978500 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-plugins-conf\") pod \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.978588 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-tls\") pod \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.978675 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-confd\") pod \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.978801 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-plugins\") pod \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\" (UID: 
\"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.978907 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\" (UID: \"2190a046-0d52-49c7-b2fd-aa113c2f3f99\") " Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.981386 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "2190a046-0d52-49c7-b2fd-aa113c2f3f99" (UID: "2190a046-0d52-49c7-b2fd-aa113c2f3f99"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.981506 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "2190a046-0d52-49c7-b2fd-aa113c2f3f99" (UID: "2190a046-0d52-49c7-b2fd-aa113c2f3f99"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.981967 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "2190a046-0d52-49c7-b2fd-aa113c2f3f99" (UID: "2190a046-0d52-49c7-b2fd-aa113c2f3f99"). InnerVolumeSpecName "plugins-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.985567 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2190a046-0d52-49c7-b2fd-aa113c2f3f99-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "2190a046-0d52-49c7-b2fd-aa113c2f3f99" (UID: "2190a046-0d52-49c7-b2fd-aa113c2f3f99"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:46:41 crc kubenswrapper[4765]: I0319 10:46:41.986752 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "persistence") pod "2190a046-0d52-49c7-b2fd-aa113c2f3f99" (UID: "2190a046-0d52-49c7-b2fd-aa113c2f3f99"). InnerVolumeSpecName "local-storage11-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.002189 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/2190a046-0d52-49c7-b2fd-aa113c2f3f99-pod-info" (OuterVolumeSpecName: "pod-info") pod "2190a046-0d52-49c7-b2fd-aa113c2f3f99" (UID: "2190a046-0d52-49c7-b2fd-aa113c2f3f99"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.003090 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "2190a046-0d52-49c7-b2fd-aa113c2f3f99" (UID: "2190a046-0d52-49c7-b2fd-aa113c2f3f99"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.003584 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-kube-api-access-8np94" (OuterVolumeSpecName: "kube-api-access-8np94") pod "2190a046-0d52-49c7-b2fd-aa113c2f3f99" (UID: "2190a046-0d52-49c7-b2fd-aa113c2f3f99"). InnerVolumeSpecName "kube-api-access-8np94". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.010068 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-config-data" (OuterVolumeSpecName: "config-data") pod "2190a046-0d52-49c7-b2fd-aa113c2f3f99" (UID: "2190a046-0d52-49c7-b2fd-aa113c2f3f99"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.045595 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-server-conf" (OuterVolumeSpecName: "server-conf") pod "2190a046-0d52-49c7-b2fd-aa113c2f3f99" (UID: "2190a046-0d52-49c7-b2fd-aa113c2f3f99"). InnerVolumeSpecName "server-conf". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.082616 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.082845 4765 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/2190a046-0d52-49c7-b2fd-aa113c2f3f99-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.082991 4765 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/2190a046-0d52-49c7-b2fd-aa113c2f3f99-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.083072 4765 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.083142 4765 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.083227 4765 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.083319 4765 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.083392 4765 
reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.083471 4765 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/2190a046-0d52-49c7-b2fd-aa113c2f3f99-server-conf\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.083549 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8np94\" (UniqueName: \"kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-kube-api-access-8np94\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.105070 4765 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.123724 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "2190a046-0d52-49c7-b2fd-aa113c2f3f99" (UID: "2190a046-0d52-49c7-b2fd-aa113c2f3f99"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.185007 4765 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/2190a046-0d52-49c7-b2fd-aa113c2f3f99-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.185059 4765 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.421931 4765 generic.go:334] "Generic (PLEG): container finished" podID="2190a046-0d52-49c7-b2fd-aa113c2f3f99" containerID="da10f80dff1f73e3d83ca18e932736d7d582457c3a6c02937b0e90ea18e0cec9" exitCode=0 Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.422073 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.422096 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2190a046-0d52-49c7-b2fd-aa113c2f3f99","Type":"ContainerDied","Data":"da10f80dff1f73e3d83ca18e932736d7d582457c3a6c02937b0e90ea18e0cec9"} Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.422890 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"2190a046-0d52-49c7-b2fd-aa113c2f3f99","Type":"ContainerDied","Data":"df30f2ef76337c66293524a48dc1dc49221fdfb46718508dbd906c418420cf0d"} Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.422926 4765 scope.go:117] "RemoveContainer" containerID="da10f80dff1f73e3d83ca18e932736d7d582457c3a6c02937b0e90ea18e0cec9" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.428335 4765 generic.go:334] "Generic (PLEG): container finished" podID="ccdb0a31-8b87-4024-848f-efebcf46e604" 
containerID="ac4ca5b206e9277cd39173f5dfd422276689843a7fc0ce7c0e0c3a65f43ef637" exitCode=0 Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.428382 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ccdb0a31-8b87-4024-848f-efebcf46e604","Type":"ContainerDied","Data":"ac4ca5b206e9277cd39173f5dfd422276689843a7fc0ce7c0e0c3a65f43ef637"} Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.473436 4765 scope.go:117] "RemoveContainer" containerID="211030e1e9b78c18cebc64fcaced7fde324341ce901ce54b5a805cbbc4f3db12" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.475023 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.486620 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.508026 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 10:46:42 crc kubenswrapper[4765]: E0319 10:46:42.508484 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2190a046-0d52-49c7-b2fd-aa113c2f3f99" containerName="setup-container" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.508506 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2190a046-0d52-49c7-b2fd-aa113c2f3f99" containerName="setup-container" Mar 19 10:46:42 crc kubenswrapper[4765]: E0319 10:46:42.508539 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb857f38-edd5-4cd5-9004-3f1737f6aec8" containerName="extract-utilities" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.508547 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb857f38-edd5-4cd5-9004-3f1737f6aec8" containerName="extract-utilities" Mar 19 10:46:42 crc kubenswrapper[4765]: E0319 10:46:42.508572 4765 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bb857f38-edd5-4cd5-9004-3f1737f6aec8" containerName="extract-content" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.508580 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb857f38-edd5-4cd5-9004-3f1737f6aec8" containerName="extract-content" Mar 19 10:46:42 crc kubenswrapper[4765]: E0319 10:46:42.508596 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2190a046-0d52-49c7-b2fd-aa113c2f3f99" containerName="rabbitmq" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.508604 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2190a046-0d52-49c7-b2fd-aa113c2f3f99" containerName="rabbitmq" Mar 19 10:46:42 crc kubenswrapper[4765]: E0319 10:46:42.508615 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bb857f38-edd5-4cd5-9004-3f1737f6aec8" containerName="registry-server" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.508622 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="bb857f38-edd5-4cd5-9004-3f1737f6aec8" containerName="registry-server" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.508826 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="2190a046-0d52-49c7-b2fd-aa113c2f3f99" containerName="rabbitmq" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.508848 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="bb857f38-edd5-4cd5-9004-3f1737f6aec8" containerName="registry-server" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.510006 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.520582 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-erlang-cookie" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.520599 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-svc" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.520805 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-plugins-conf" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.521685 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-default-user" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.521931 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-server-dockercfg-k796b" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.522288 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-server-conf" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.526031 4765 scope.go:117] "RemoveContainer" containerID="da10f80dff1f73e3d83ca18e932736d7d582457c3a6c02937b0e90ea18e0cec9" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.526302 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.527181 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-config-data" Mar 19 10:46:42 crc kubenswrapper[4765]: E0319 10:46:42.528839 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da10f80dff1f73e3d83ca18e932736d7d582457c3a6c02937b0e90ea18e0cec9\": container with ID starting with da10f80dff1f73e3d83ca18e932736d7d582457c3a6c02937b0e90ea18e0cec9 not found: ID does not exist" 
containerID="da10f80dff1f73e3d83ca18e932736d7d582457c3a6c02937b0e90ea18e0cec9" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.528872 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da10f80dff1f73e3d83ca18e932736d7d582457c3a6c02937b0e90ea18e0cec9"} err="failed to get container status \"da10f80dff1f73e3d83ca18e932736d7d582457c3a6c02937b0e90ea18e0cec9\": rpc error: code = NotFound desc = could not find container \"da10f80dff1f73e3d83ca18e932736d7d582457c3a6c02937b0e90ea18e0cec9\": container with ID starting with da10f80dff1f73e3d83ca18e932736d7d582457c3a6c02937b0e90ea18e0cec9 not found: ID does not exist" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.528895 4765 scope.go:117] "RemoveContainer" containerID="211030e1e9b78c18cebc64fcaced7fde324341ce901ce54b5a805cbbc4f3db12" Mar 19 10:46:42 crc kubenswrapper[4765]: E0319 10:46:42.532110 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"211030e1e9b78c18cebc64fcaced7fde324341ce901ce54b5a805cbbc4f3db12\": container with ID starting with 211030e1e9b78c18cebc64fcaced7fde324341ce901ce54b5a805cbbc4f3db12 not found: ID does not exist" containerID="211030e1e9b78c18cebc64fcaced7fde324341ce901ce54b5a805cbbc4f3db12" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.532147 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"211030e1e9b78c18cebc64fcaced7fde324341ce901ce54b5a805cbbc4f3db12"} err="failed to get container status \"211030e1e9b78c18cebc64fcaced7fde324341ce901ce54b5a805cbbc4f3db12\": rpc error: code = NotFound desc = could not find container \"211030e1e9b78c18cebc64fcaced7fde324341ce901ce54b5a805cbbc4f3db12\": container with ID starting with 211030e1e9b78c18cebc64fcaced7fde324341ce901ce54b5a805cbbc4f3db12 not found: ID does not exist" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.593757 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.593806 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53406a09-7bdd-4517-ac01-0823bce386bc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.593831 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53406a09-7bdd-4517-ac01-0823bce386bc-config-data\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.593848 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53406a09-7bdd-4517-ac01-0823bce386bc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.593999 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53406a09-7bdd-4517-ac01-0823bce386bc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.594049 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53406a09-7bdd-4517-ac01-0823bce386bc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.594104 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53406a09-7bdd-4517-ac01-0823bce386bc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.594126 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53406a09-7bdd-4517-ac01-0823bce386bc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.594149 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53406a09-7bdd-4517-ac01-0823bce386bc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.594168 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fvh5\" (UniqueName: \"kubernetes.io/projected/53406a09-7bdd-4517-ac01-0823bce386bc-kube-api-access-7fvh5\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.594186 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: 
\"kubernetes.io/projected/53406a09-7bdd-4517-ac01-0823bce386bc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.695590 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53406a09-7bdd-4517-ac01-0823bce386bc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.695645 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53406a09-7bdd-4517-ac01-0823bce386bc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.695680 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53406a09-7bdd-4517-ac01-0823bce386bc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.695860 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7fvh5\" (UniqueName: \"kubernetes.io/projected/53406a09-7bdd-4517-ac01-0823bce386bc-kube-api-access-7fvh5\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.695895 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53406a09-7bdd-4517-ac01-0823bce386bc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: 
\"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.695941 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.695999 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53406a09-7bdd-4517-ac01-0823bce386bc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.696025 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53406a09-7bdd-4517-ac01-0823bce386bc-config-data\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.696046 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53406a09-7bdd-4517-ac01-0823bce386bc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.696153 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53406a09-7bdd-4517-ac01-0823bce386bc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.696218 4765 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53406a09-7bdd-4517-ac01-0823bce386bc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.696328 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/53406a09-7bdd-4517-ac01-0823bce386bc-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.696439 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") device mount path \"/mnt/openstack/pv11\"" pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.696787 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/53406a09-7bdd-4517-ac01-0823bce386bc-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.699004 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/53406a09-7bdd-4517-ac01-0823bce386bc-config-data\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.699664 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/53406a09-7bdd-4517-ac01-0823bce386bc-plugins-conf\") pod \"rabbitmq-server-0\" (UID: 
\"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.700674 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/53406a09-7bdd-4517-ac01-0823bce386bc-server-conf\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.704403 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/53406a09-7bdd-4517-ac01-0823bce386bc-rabbitmq-tls\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.704418 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/53406a09-7bdd-4517-ac01-0823bce386bc-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.704821 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/53406a09-7bdd-4517-ac01-0823bce386bc-pod-info\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.712499 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/53406a09-7bdd-4517-ac01-0823bce386bc-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.722665 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-7fvh5\" (UniqueName: \"kubernetes.io/projected/53406a09-7bdd-4517-ac01-0823bce386bc-kube-api-access-7fvh5\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.738194 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"rabbitmq-server-0\" (UID: \"53406a09-7bdd-4517-ac01-0823bce386bc\") " pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.797137 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.875148 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/rabbitmq-server-0" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.899041 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-erlang-cookie\") pod \"ccdb0a31-8b87-4024-848f-efebcf46e604\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.899120 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-tls\") pod \"ccdb0a31-8b87-4024-848f-efebcf46e604\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.899150 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-plugins-conf\") pod \"ccdb0a31-8b87-4024-848f-efebcf46e604\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " 
Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.899177 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-server-conf\") pod \"ccdb0a31-8b87-4024-848f-efebcf46e604\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.899226 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-confd\") pod \"ccdb0a31-8b87-4024-848f-efebcf46e604\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.899249 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ccdb0a31-8b87-4024-848f-efebcf46e604-erlang-cookie-secret\") pod \"ccdb0a31-8b87-4024-848f-efebcf46e604\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.899308 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-config-data\") pod \"ccdb0a31-8b87-4024-848f-efebcf46e604\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.899346 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcfsp\" (UniqueName: \"kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-kube-api-access-jcfsp\") pod \"ccdb0a31-8b87-4024-848f-efebcf46e604\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.899431 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage03-crc\") pod \"ccdb0a31-8b87-4024-848f-efebcf46e604\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.899487 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-plugins\") pod \"ccdb0a31-8b87-4024-848f-efebcf46e604\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.899530 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ccdb0a31-8b87-4024-848f-efebcf46e604-pod-info\") pod \"ccdb0a31-8b87-4024-848f-efebcf46e604\" (UID: \"ccdb0a31-8b87-4024-848f-efebcf46e604\") " Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.905411 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "ccdb0a31-8b87-4024-848f-efebcf46e604" (UID: "ccdb0a31-8b87-4024-848f-efebcf46e604"). InnerVolumeSpecName "rabbitmq-plugins". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.909075 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/ccdb0a31-8b87-4024-848f-efebcf46e604-pod-info" (OuterVolumeSpecName: "pod-info") pod "ccdb0a31-8b87-4024-848f-efebcf46e604" (UID: "ccdb0a31-8b87-4024-848f-efebcf46e604"). InnerVolumeSpecName "pod-info". 
PluginName "kubernetes.io/downward-api", VolumeGidValue "" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.912455 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "ccdb0a31-8b87-4024-848f-efebcf46e604" (UID: "ccdb0a31-8b87-4024-848f-efebcf46e604"). InnerVolumeSpecName "rabbitmq-erlang-cookie". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.916284 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ccdb0a31-8b87-4024-848f-efebcf46e604-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "ccdb0a31-8b87-4024-848f-efebcf46e604" (UID: "ccdb0a31-8b87-4024-848f-efebcf46e604"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.918514 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "persistence") pod "ccdb0a31-8b87-4024-848f-efebcf46e604" (UID: "ccdb0a31-8b87-4024-848f-efebcf46e604"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.923163 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-tls" (OuterVolumeSpecName: "rabbitmq-tls") pod "ccdb0a31-8b87-4024-848f-efebcf46e604" (UID: "ccdb0a31-8b87-4024-848f-efebcf46e604"). InnerVolumeSpecName "rabbitmq-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.923170 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "ccdb0a31-8b87-4024-848f-efebcf46e604" (UID: "ccdb0a31-8b87-4024-848f-efebcf46e604"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.926877 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-kube-api-access-jcfsp" (OuterVolumeSpecName: "kube-api-access-jcfsp") pod "ccdb0a31-8b87-4024-848f-efebcf46e604" (UID: "ccdb0a31-8b87-4024-848f-efebcf46e604"). InnerVolumeSpecName "kube-api-access-jcfsp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:46:42 crc kubenswrapper[4765]: I0319 10:46:42.950090 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-config-data" (OuterVolumeSpecName: "config-data") pod "ccdb0a31-8b87-4024-848f-efebcf46e604" (UID: "ccdb0a31-8b87-4024-848f-efebcf46e604"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:42.999224 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-d558885bc-r5tvs"] Mar 19 10:46:43 crc kubenswrapper[4765]: E0319 10:46:42.999696 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccdb0a31-8b87-4024-848f-efebcf46e604" containerName="rabbitmq" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:42.999713 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccdb0a31-8b87-4024-848f-efebcf46e604" containerName="rabbitmq" Mar 19 10:46:43 crc kubenswrapper[4765]: E0319 10:46:42.999729 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ccdb0a31-8b87-4024-848f-efebcf46e604" containerName="setup-container" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:42.999737 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ccdb0a31-8b87-4024-848f-efebcf46e604" containerName="setup-container" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:42.999930 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ccdb0a31-8b87-4024-848f-efebcf46e604" containerName="rabbitmq" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.000909 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.001538 4765 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.001583 4765 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.001600 4765 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/ccdb0a31-8b87-4024-848f-efebcf46e604-pod-info\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.001611 4765 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.001624 4765 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-tls\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.001636 4765 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-plugins-conf\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.001646 4765 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/ccdb0a31-8b87-4024-848f-efebcf46e604-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 
10:46:43.001656 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.001666 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcfsp\" (UniqueName: \"kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-kube-api-access-jcfsp\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.005634 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-edpm-ipam" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.024203 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-server-conf" (OuterVolumeSpecName: "server-conf") pod "ccdb0a31-8b87-4024-848f-efebcf46e604" (UID: "ccdb0a31-8b87-4024-848f-efebcf46e604"). InnerVolumeSpecName "server-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.035776 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-r5tvs"] Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.054566 4765 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.084089 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "ccdb0a31-8b87-4024-848f-efebcf46e604" (UID: "ccdb0a31-8b87-4024-848f-efebcf46e604"). InnerVolumeSpecName "rabbitmq-confd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.103653 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.103718 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.103745 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-config\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.103765 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.103825 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-dns-svc\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: 
\"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.103943 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jm94\" (UniqueName: \"kubernetes.io/projected/a386293e-4f14-42af-824d-cbe47d3680b9-kube-api-access-9jm94\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.104172 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.104353 4765 reconciler_common.go:293] "Volume detached for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/ccdb0a31-8b87-4024-848f-efebcf46e604-server-conf\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.104368 4765 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/ccdb0a31-8b87-4024-848f-efebcf46e604-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.104381 4765 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.206294 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-dns-swift-storage-0\") pod 
\"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.206820 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.206894 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-config\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.206927 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.206949 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.207015 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-dns-svc\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " 
pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.207069 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9jm94\" (UniqueName: \"kubernetes.io/projected/a386293e-4f14-42af-824d-cbe47d3680b9-kube-api-access-9jm94\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.207498 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-dns-swift-storage-0\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.208417 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-openstack-edpm-ipam\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.208468 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-config\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.209186 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-dns-svc\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 
10:46:43.209623 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-ovsdbserver-nb\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.210107 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-ovsdbserver-sb\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.226953 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jm94\" (UniqueName: \"kubernetes.io/projected/a386293e-4f14-42af-824d-cbe47d3680b9-kube-api-access-9jm94\") pod \"dnsmasq-dns-d558885bc-r5tvs\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.360710 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.472302 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"ccdb0a31-8b87-4024-848f-efebcf46e604","Type":"ContainerDied","Data":"2c27c17a27d18f09dca325f0e2571f97508fb5de9b46463c1871c2b66c4e5bcc"} Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.472347 4765 scope.go:117] "RemoveContainer" containerID="ac4ca5b206e9277cd39173f5dfd422276689843a7fc0ce7c0e0c3a65f43ef637" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.472438 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.522569 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-server-0"] Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.570372 4765 scope.go:117] "RemoveContainer" containerID="ede7c513464d77c5408931eb19bc0afaf30f0f27c91691fc636041f35a22d68b" Mar 19 10:46:43 crc kubenswrapper[4765]: W0319 10:46:43.591033 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod53406a09_7bdd_4517_ac01_0823bce386bc.slice/crio-9a744f1a8ff26fa91c782130415eff74acc00b393f1180a91817b771eee334c1 WatchSource:0}: Error finding container 9a744f1a8ff26fa91c782130415eff74acc00b393f1180a91817b771eee334c1: Status 404 returned error can't find the container with id 9a744f1a8ff26fa91c782130415eff74acc00b393f1180a91817b771eee334c1 Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.677425 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.694474 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.712650 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.718253 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.724213 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-plugins-conf" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.724417 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"cert-rabbitmq-cell1-svc" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.724477 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-server-dockercfg-spcmw" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.724228 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-config-data" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.724998 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-erlang-cookie" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.725191 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"rabbitmq-cell1-server-conf" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.729833 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"rabbitmq-cell1-default-user" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.761050 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 10:46:43 crc kubenswrapper[4765]: W0319 10:46:43.820904 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda386293e_4f14_42af_824d_cbe47d3680b9.slice/crio-5b5c87156c647c26d6457c69e8abcc5750d8473b4984ecf91a7c1ea49950bf7e WatchSource:0}: Error finding container 5b5c87156c647c26d6457c69e8abcc5750d8473b4984ecf91a7c1ea49950bf7e: Status 404 returned error can't find the container with id 5b5c87156c647c26d6457c69e8abcc5750d8473b4984ecf91a7c1ea49950bf7e Mar 19 10:46:43 
crc kubenswrapper[4765]: I0319 10:46:43.825460 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-r5tvs"] Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.843384 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c3043d68-f6dc-4095-bc0e-62b2282dd297-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.843680 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c3043d68-f6dc-4095-bc0e-62b2282dd297-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.843763 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c3043d68-f6dc-4095-bc0e-62b2282dd297-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.843951 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c3043d68-f6dc-4095-bc0e-62b2282dd297-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.844026 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/c3043d68-f6dc-4095-bc0e-62b2282dd297-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.844074 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m8rqn\" (UniqueName: \"kubernetes.io/projected/c3043d68-f6dc-4095-bc0e-62b2282dd297-kube-api-access-m8rqn\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.844103 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c3043d68-f6dc-4095-bc0e-62b2282dd297-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.844199 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.844225 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c3043d68-f6dc-4095-bc0e-62b2282dd297-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.844257 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: 
\"kubernetes.io/projected/c3043d68-f6dc-4095-bc0e-62b2282dd297-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.844292 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c3043d68-f6dc-4095-bc0e-62b2282dd297-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.946473 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c3043d68-f6dc-4095-bc0e-62b2282dd297-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.946815 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c3043d68-f6dc-4095-bc0e-62b2282dd297-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.946851 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/c3043d68-f6dc-4095-bc0e-62b2282dd297-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.946877 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m8rqn\" (UniqueName: \"kubernetes.io/projected/c3043d68-f6dc-4095-bc0e-62b2282dd297-kube-api-access-m8rqn\") 
pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.946898 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c3043d68-f6dc-4095-bc0e-62b2282dd297-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.946938 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.947032 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c3043d68-f6dc-4095-bc0e-62b2282dd297-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.947067 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c3043d68-f6dc-4095-bc0e-62b2282dd297-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.947088 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c3043d68-f6dc-4095-bc0e-62b2282dd297-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" 
Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.947147 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c3043d68-f6dc-4095-bc0e-62b2282dd297-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.947178 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c3043d68-f6dc-4095-bc0e-62b2282dd297-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.948922 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/c3043d68-f6dc-4095-bc0e-62b2282dd297-rabbitmq-plugins\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.949185 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") device mount path \"/mnt/openstack/pv03\"" pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.949528 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/c3043d68-f6dc-4095-bc0e-62b2282dd297-plugins-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.949233 4765 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/c3043d68-f6dc-4095-bc0e-62b2282dd297-rabbitmq-erlang-cookie\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.949855 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"server-conf\" (UniqueName: \"kubernetes.io/configmap/c3043d68-f6dc-4095-bc0e-62b2282dd297-server-conf\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.950788 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/c3043d68-f6dc-4095-bc0e-62b2282dd297-config-data\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.952352 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/c3043d68-f6dc-4095-bc0e-62b2282dd297-erlang-cookie-secret\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.952895 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-tls\" (UniqueName: \"kubernetes.io/projected/c3043d68-f6dc-4095-bc0e-62b2282dd297-rabbitmq-tls\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.955428 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: 
\"kubernetes.io/downward-api/c3043d68-f6dc-4095-bc0e-62b2282dd297-pod-info\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.956013 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/c3043d68-f6dc-4095-bc0e-62b2282dd297-rabbitmq-confd\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.965245 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m8rqn\" (UniqueName: \"kubernetes.io/projected/c3043d68-f6dc-4095-bc0e-62b2282dd297-kube-api-access-m8rqn\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:43 crc kubenswrapper[4765]: I0319 10:46:43.983918 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"rabbitmq-cell1-server-0\" (UID: \"c3043d68-f6dc-4095-bc0e-62b2282dd297\") " pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:44 crc kubenswrapper[4765]: I0319 10:46:44.123347 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:46:44 crc kubenswrapper[4765]: I0319 10:46:44.369751 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2190a046-0d52-49c7-b2fd-aa113c2f3f99" path="/var/lib/kubelet/pods/2190a046-0d52-49c7-b2fd-aa113c2f3f99/volumes" Mar 19 10:46:44 crc kubenswrapper[4765]: I0319 10:46:44.372901 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ccdb0a31-8b87-4024-848f-efebcf46e604" path="/var/lib/kubelet/pods/ccdb0a31-8b87-4024-848f-efebcf46e604/volumes" Mar 19 10:46:44 crc kubenswrapper[4765]: I0319 10:46:44.493140 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53406a09-7bdd-4517-ac01-0823bce386bc","Type":"ContainerStarted","Data":"9a744f1a8ff26fa91c782130415eff74acc00b393f1180a91817b771eee334c1"} Mar 19 10:46:44 crc kubenswrapper[4765]: I0319 10:46:44.495380 4765 generic.go:334] "Generic (PLEG): container finished" podID="a386293e-4f14-42af-824d-cbe47d3680b9" containerID="ed9e3aa67877a300dbc8ace167893b01a53af04bb408658feef27319888db80f" exitCode=0 Mar 19 10:46:44 crc kubenswrapper[4765]: I0319 10:46:44.495453 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-r5tvs" event={"ID":"a386293e-4f14-42af-824d-cbe47d3680b9","Type":"ContainerDied","Data":"ed9e3aa67877a300dbc8ace167893b01a53af04bb408658feef27319888db80f"} Mar 19 10:46:44 crc kubenswrapper[4765]: I0319 10:46:44.495479 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-r5tvs" event={"ID":"a386293e-4f14-42af-824d-cbe47d3680b9","Type":"ContainerStarted","Data":"5b5c87156c647c26d6457c69e8abcc5750d8473b4984ecf91a7c1ea49950bf7e"} Mar 19 10:46:44 crc kubenswrapper[4765]: I0319 10:46:44.623119 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/rabbitmq-cell1-server-0"] Mar 19 10:46:44 crc kubenswrapper[4765]: W0319 10:46:44.624231 4765 manager.go:1169] 
Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3043d68_f6dc_4095_bc0e_62b2282dd297.slice/crio-ba6e7bc47bfe05e8a1619698bc25c3916344c8d1c3fdd35f508734a122ce6f00 WatchSource:0}: Error finding container ba6e7bc47bfe05e8a1619698bc25c3916344c8d1c3fdd35f508734a122ce6f00: Status 404 returned error can't find the container with id ba6e7bc47bfe05e8a1619698bc25c3916344c8d1c3fdd35f508734a122ce6f00 Mar 19 10:46:44 crc kubenswrapper[4765]: E0319 10:46:44.842442 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2190a046_0d52_49c7_b2fd_aa113c2f3f99.slice/crio-df30f2ef76337c66293524a48dc1dc49221fdfb46718508dbd906c418420cf0d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2190a046_0d52_49c7_b2fd_aa113c2f3f99.slice\": RecentStats: unable to find data in memory cache]" Mar 19 10:46:45 crc kubenswrapper[4765]: I0319 10:46:45.509869 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-r5tvs" event={"ID":"a386293e-4f14-42af-824d-cbe47d3680b9","Type":"ContainerStarted","Data":"e58c7f83219ea4f6c7341ba895368047a283eff311094d22871e278dea8d2df6"} Mar 19 10:46:45 crc kubenswrapper[4765]: I0319 10:46:45.510248 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:45 crc kubenswrapper[4765]: I0319 10:46:45.512704 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c3043d68-f6dc-4095-bc0e-62b2282dd297","Type":"ContainerStarted","Data":"ba6e7bc47bfe05e8a1619698bc25c3916344c8d1c3fdd35f508734a122ce6f00"} Mar 19 10:46:45 crc kubenswrapper[4765]: I0319 10:46:45.531647 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/dnsmasq-dns-d558885bc-r5tvs" podStartSLOduration=3.531627409 podStartE2EDuration="3.531627409s" podCreationTimestamp="2026-03-19 10:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:46:45.524109414 +0000 UTC m=+1503.873054976" watchObservedRunningTime="2026-03-19 10:46:45.531627409 +0000 UTC m=+1503.880572951" Mar 19 10:46:46 crc kubenswrapper[4765]: I0319 10:46:46.522469 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53406a09-7bdd-4517-ac01-0823bce386bc","Type":"ContainerStarted","Data":"4e0cfb1cf40da954d5de612f7bc6f4ab3700e0aaa0924cbeb7e16a3b91046b3a"} Mar 19 10:46:46 crc kubenswrapper[4765]: I0319 10:46:46.524119 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c3043d68-f6dc-4095-bc0e-62b2282dd297","Type":"ContainerStarted","Data":"a70129a543ce35722d209e43cec6e98b05648ad0f47d4281a61ffc2aa1a2ad27"} Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.364189 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.501424 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-bq6nt"] Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.502273 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" podUID="949ced49-5178-4806-a521-3b46431783ba" containerName="dnsmasq-dns" containerID="cri-o://dbcae6d1d68b9a447d0c88f93d5423fdfa7b348d6a27d9ef6bc11d14abf9c5e6" gracePeriod=10 Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.658359 4765 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" podUID="949ced49-5178-4806-a521-3b46431783ba" containerName="dnsmasq-dns" 
probeResult="failure" output="dial tcp 10.217.0.205:5353: connect: connection refused" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.766970 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-t5pnc"] Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.769123 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.796012 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-t5pnc"] Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.838284 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.838370 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.838443 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-config\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.838615 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-86wsz\" (UniqueName: \"kubernetes.io/projected/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-kube-api-access-86wsz\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.838742 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.838922 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.838973 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.949354 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.949433 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-config\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.949525 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-86wsz\" (UniqueName: \"kubernetes.io/projected/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-kube-api-access-86wsz\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.949598 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.949692 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.949718 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.949768 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.950328 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-openstack-edpm-ipam\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.950665 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-ovsdbserver-sb\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.950771 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-ovsdbserver-nb\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.950894 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-dns-svc\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.951288 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-config\") 
pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.951800 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-dns-swift-storage-0\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:53 crc kubenswrapper[4765]: I0319 10:46:53.970670 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-86wsz\" (UniqueName: \"kubernetes.io/projected/8051ab1f-9c27-4c1b-b9f9-9f883c67bea9-kube-api-access-86wsz\") pod \"dnsmasq-dns-78c64bc9c5-t5pnc\" (UID: \"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9\") " pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.092851 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.198021 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.258085 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-ovsdbserver-nb\") pod \"949ced49-5178-4806-a521-3b46431783ba\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.258167 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ljx4w\" (UniqueName: \"kubernetes.io/projected/949ced49-5178-4806-a521-3b46431783ba-kube-api-access-ljx4w\") pod \"949ced49-5178-4806-a521-3b46431783ba\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.258188 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-dns-svc\") pod \"949ced49-5178-4806-a521-3b46431783ba\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.258242 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-config\") pod \"949ced49-5178-4806-a521-3b46431783ba\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.258353 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-dns-swift-storage-0\") pod \"949ced49-5178-4806-a521-3b46431783ba\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.258424 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" 
(UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-ovsdbserver-sb\") pod \"949ced49-5178-4806-a521-3b46431783ba\" (UID: \"949ced49-5178-4806-a521-3b46431783ba\") " Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.271198 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949ced49-5178-4806-a521-3b46431783ba-kube-api-access-ljx4w" (OuterVolumeSpecName: "kube-api-access-ljx4w") pod "949ced49-5178-4806-a521-3b46431783ba" (UID: "949ced49-5178-4806-a521-3b46431783ba"). InnerVolumeSpecName "kube-api-access-ljx4w". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.325544 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "949ced49-5178-4806-a521-3b46431783ba" (UID: "949ced49-5178-4806-a521-3b46431783ba"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.338446 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "949ced49-5178-4806-a521-3b46431783ba" (UID: "949ced49-5178-4806-a521-3b46431783ba"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.338552 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "949ced49-5178-4806-a521-3b46431783ba" (UID: "949ced49-5178-4806-a521-3b46431783ba"). InnerVolumeSpecName "dns-swift-storage-0". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.366686 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "949ced49-5178-4806-a521-3b46431783ba" (UID: "949ced49-5178-4806-a521-3b46431783ba"). InnerVolumeSpecName "ovsdbserver-nb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.368745 4765 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.368790 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.368804 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.368815 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ljx4w\" (UniqueName: \"kubernetes.io/projected/949ced49-5178-4806-a521-3b46431783ba-kube-api-access-ljx4w\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.368828 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.391864 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-config" (OuterVolumeSpecName: "config") pod "949ced49-5178-4806-a521-3b46431783ba" (UID: "949ced49-5178-4806-a521-3b46431783ba"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.470616 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/949ced49-5178-4806-a521-3b46431783ba-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.593899 4765 generic.go:334] "Generic (PLEG): container finished" podID="949ced49-5178-4806-a521-3b46431783ba" containerID="dbcae6d1d68b9a447d0c88f93d5423fdfa7b348d6a27d9ef6bc11d14abf9c5e6" exitCode=0 Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.593951 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" event={"ID":"949ced49-5178-4806-a521-3b46431783ba","Type":"ContainerDied","Data":"dbcae6d1d68b9a447d0c88f93d5423fdfa7b348d6a27d9ef6bc11d14abf9c5e6"} Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.594006 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" event={"ID":"949ced49-5178-4806-a521-3b46431783ba","Type":"ContainerDied","Data":"d1bd3b56ab09c6f8b3f781e9ac52ab8b319a0778e6691f4f010f31d1c9797135"} Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.594028 4765 scope.go:117] "RemoveContainer" containerID="dbcae6d1d68b9a447d0c88f93d5423fdfa7b348d6a27d9ef6bc11d14abf9c5e6" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.594066 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-cd5cbd7b9-bq6nt" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.615917 4765 scope.go:117] "RemoveContainer" containerID="91a1835a424bee47aa55c74b4b7ef3089e8509ed1b3e0d472d1367711a8e4cc4" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.621004 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78c64bc9c5-t5pnc"] Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.631149 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-bq6nt"] Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.641520 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-cd5cbd7b9-bq6nt"] Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.642158 4765 scope.go:117] "RemoveContainer" containerID="dbcae6d1d68b9a447d0c88f93d5423fdfa7b348d6a27d9ef6bc11d14abf9c5e6" Mar 19 10:46:54 crc kubenswrapper[4765]: E0319 10:46:54.645039 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dbcae6d1d68b9a447d0c88f93d5423fdfa7b348d6a27d9ef6bc11d14abf9c5e6\": container with ID starting with dbcae6d1d68b9a447d0c88f93d5423fdfa7b348d6a27d9ef6bc11d14abf9c5e6 not found: ID does not exist" containerID="dbcae6d1d68b9a447d0c88f93d5423fdfa7b348d6a27d9ef6bc11d14abf9c5e6" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.645291 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dbcae6d1d68b9a447d0c88f93d5423fdfa7b348d6a27d9ef6bc11d14abf9c5e6"} err="failed to get container status \"dbcae6d1d68b9a447d0c88f93d5423fdfa7b348d6a27d9ef6bc11d14abf9c5e6\": rpc error: code = NotFound desc = could not find container \"dbcae6d1d68b9a447d0c88f93d5423fdfa7b348d6a27d9ef6bc11d14abf9c5e6\": container with ID starting with dbcae6d1d68b9a447d0c88f93d5423fdfa7b348d6a27d9ef6bc11d14abf9c5e6 not found: ID does not exist" Mar 19 10:46:54 crc 
kubenswrapper[4765]: I0319 10:46:54.645394 4765 scope.go:117] "RemoveContainer" containerID="91a1835a424bee47aa55c74b4b7ef3089e8509ed1b3e0d472d1367711a8e4cc4" Mar 19 10:46:54 crc kubenswrapper[4765]: E0319 10:46:54.645728 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"91a1835a424bee47aa55c74b4b7ef3089e8509ed1b3e0d472d1367711a8e4cc4\": container with ID starting with 91a1835a424bee47aa55c74b4b7ef3089e8509ed1b3e0d472d1367711a8e4cc4 not found: ID does not exist" containerID="91a1835a424bee47aa55c74b4b7ef3089e8509ed1b3e0d472d1367711a8e4cc4" Mar 19 10:46:54 crc kubenswrapper[4765]: I0319 10:46:54.645754 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"91a1835a424bee47aa55c74b4b7ef3089e8509ed1b3e0d472d1367711a8e4cc4"} err="failed to get container status \"91a1835a424bee47aa55c74b4b7ef3089e8509ed1b3e0d472d1367711a8e4cc4\": rpc error: code = NotFound desc = could not find container \"91a1835a424bee47aa55c74b4b7ef3089e8509ed1b3e0d472d1367711a8e4cc4\": container with ID starting with 91a1835a424bee47aa55c74b4b7ef3089e8509ed1b3e0d472d1367711a8e4cc4 not found: ID does not exist" Mar 19 10:46:55 crc kubenswrapper[4765]: E0319 10:46:55.094372 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2190a046_0d52_49c7_b2fd_aa113c2f3f99.slice/crio-df30f2ef76337c66293524a48dc1dc49221fdfb46718508dbd906c418420cf0d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8051ab1f_9c27_4c1b_b9f9_9f883c67bea9.slice/crio-82bd7c6afb79b896fa9eb2196b836321eba4b08018a7b85fc43bc9838e7a8ade.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2190a046_0d52_49c7_b2fd_aa113c2f3f99.slice\": RecentStats: unable to 
find data in memory cache]" Mar 19 10:46:55 crc kubenswrapper[4765]: I0319 10:46:55.605570 4765 generic.go:334] "Generic (PLEG): container finished" podID="8051ab1f-9c27-4c1b-b9f9-9f883c67bea9" containerID="82bd7c6afb79b896fa9eb2196b836321eba4b08018a7b85fc43bc9838e7a8ade" exitCode=0 Mar 19 10:46:55 crc kubenswrapper[4765]: I0319 10:46:55.605608 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" event={"ID":"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9","Type":"ContainerDied","Data":"82bd7c6afb79b896fa9eb2196b836321eba4b08018a7b85fc43bc9838e7a8ade"} Mar 19 10:46:55 crc kubenswrapper[4765]: I0319 10:46:55.605630 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" event={"ID":"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9","Type":"ContainerStarted","Data":"9a76262f84e7f7c3819ea4cc6a940fbcbc7024e1359c75f82713a0122744bf82"} Mar 19 10:46:56 crc kubenswrapper[4765]: I0319 10:46:56.367055 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949ced49-5178-4806-a521-3b46431783ba" path="/var/lib/kubelet/pods/949ced49-5178-4806-a521-3b46431783ba/volumes" Mar 19 10:46:56 crc kubenswrapper[4765]: I0319 10:46:56.616499 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" event={"ID":"8051ab1f-9c27-4c1b-b9f9-9f883c67bea9","Type":"ContainerStarted","Data":"308aee0191aab5d0b29c23692eac4ee2dc723c4d305a9e62bb448c1a22296777"} Mar 19 10:46:56 crc kubenswrapper[4765]: I0319 10:46:56.617632 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:46:56 crc kubenswrapper[4765]: I0319 10:46:56.643661 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" podStartSLOduration=3.6436385959999997 podStartE2EDuration="3.643638596s" podCreationTimestamp="2026-03-19 10:46:53 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:46:56.634334153 +0000 UTC m=+1514.983279715" watchObservedRunningTime="2026-03-19 10:46:56.643638596 +0000 UTC m=+1514.992584138" Mar 19 10:46:57 crc kubenswrapper[4765]: I0319 10:46:57.581301 4765 scope.go:117] "RemoveContainer" containerID="1dbfa468f3f14bdc0b4bd4795787e07bcfc06b131f93eb6364d7b257f6bd081e" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.095165 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78c64bc9c5-t5pnc" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.163923 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-r5tvs"] Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.165035 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-d558885bc-r5tvs" podUID="a386293e-4f14-42af-824d-cbe47d3680b9" containerName="dnsmasq-dns" containerID="cri-o://e58c7f83219ea4f6c7341ba895368047a283eff311094d22871e278dea8d2df6" gracePeriod=10 Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.691721 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.700572 4765 generic.go:334] "Generic (PLEG): container finished" podID="a386293e-4f14-42af-824d-cbe47d3680b9" containerID="e58c7f83219ea4f6c7341ba895368047a283eff311094d22871e278dea8d2df6" exitCode=0 Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.700629 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-r5tvs" event={"ID":"a386293e-4f14-42af-824d-cbe47d3680b9","Type":"ContainerDied","Data":"e58c7f83219ea4f6c7341ba895368047a283eff311094d22871e278dea8d2df6"} Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.700664 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-d558885bc-r5tvs" event={"ID":"a386293e-4f14-42af-824d-cbe47d3680b9","Type":"ContainerDied","Data":"5b5c87156c647c26d6457c69e8abcc5750d8473b4984ecf91a7c1ea49950bf7e"} Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.700684 4765 scope.go:117] "RemoveContainer" containerID="e58c7f83219ea4f6c7341ba895368047a283eff311094d22871e278dea8d2df6" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.700827 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-d558885bc-r5tvs" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.738478 4765 scope.go:117] "RemoveContainer" containerID="ed9e3aa67877a300dbc8ace167893b01a53af04bb408658feef27319888db80f" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.758862 4765 scope.go:117] "RemoveContainer" containerID="e58c7f83219ea4f6c7341ba895368047a283eff311094d22871e278dea8d2df6" Mar 19 10:47:04 crc kubenswrapper[4765]: E0319 10:47:04.759430 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e58c7f83219ea4f6c7341ba895368047a283eff311094d22871e278dea8d2df6\": container with ID starting with e58c7f83219ea4f6c7341ba895368047a283eff311094d22871e278dea8d2df6 not found: ID does not exist" containerID="e58c7f83219ea4f6c7341ba895368047a283eff311094d22871e278dea8d2df6" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.759478 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e58c7f83219ea4f6c7341ba895368047a283eff311094d22871e278dea8d2df6"} err="failed to get container status \"e58c7f83219ea4f6c7341ba895368047a283eff311094d22871e278dea8d2df6\": rpc error: code = NotFound desc = could not find container \"e58c7f83219ea4f6c7341ba895368047a283eff311094d22871e278dea8d2df6\": container with ID starting with e58c7f83219ea4f6c7341ba895368047a283eff311094d22871e278dea8d2df6 not found: ID does not exist" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.759503 4765 scope.go:117] "RemoveContainer" containerID="ed9e3aa67877a300dbc8ace167893b01a53af04bb408658feef27319888db80f" Mar 19 10:47:04 crc kubenswrapper[4765]: E0319 10:47:04.759744 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed9e3aa67877a300dbc8ace167893b01a53af04bb408658feef27319888db80f\": container with ID starting with 
ed9e3aa67877a300dbc8ace167893b01a53af04bb408658feef27319888db80f not found: ID does not exist" containerID="ed9e3aa67877a300dbc8ace167893b01a53af04bb408658feef27319888db80f" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.759791 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed9e3aa67877a300dbc8ace167893b01a53af04bb408658feef27319888db80f"} err="failed to get container status \"ed9e3aa67877a300dbc8ace167893b01a53af04bb408658feef27319888db80f\": rpc error: code = NotFound desc = could not find container \"ed9e3aa67877a300dbc8ace167893b01a53af04bb408658feef27319888db80f\": container with ID starting with ed9e3aa67877a300dbc8ace167893b01a53af04bb408658feef27319888db80f not found: ID does not exist" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.779627 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-config\") pod \"a386293e-4f14-42af-824d-cbe47d3680b9\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.779706 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-dns-svc\") pod \"a386293e-4f14-42af-824d-cbe47d3680b9\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.779761 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-dns-swift-storage-0\") pod \"a386293e-4f14-42af-824d-cbe47d3680b9\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.779927 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-openstack-edpm-ipam\") pod \"a386293e-4f14-42af-824d-cbe47d3680b9\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.779995 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-ovsdbserver-sb\") pod \"a386293e-4f14-42af-824d-cbe47d3680b9\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.780045 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jm94\" (UniqueName: \"kubernetes.io/projected/a386293e-4f14-42af-824d-cbe47d3680b9-kube-api-access-9jm94\") pod \"a386293e-4f14-42af-824d-cbe47d3680b9\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.780085 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-ovsdbserver-nb\") pod \"a386293e-4f14-42af-824d-cbe47d3680b9\" (UID: \"a386293e-4f14-42af-824d-cbe47d3680b9\") " Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.789668 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a386293e-4f14-42af-824d-cbe47d3680b9-kube-api-access-9jm94" (OuterVolumeSpecName: "kube-api-access-9jm94") pod "a386293e-4f14-42af-824d-cbe47d3680b9" (UID: "a386293e-4f14-42af-824d-cbe47d3680b9"). InnerVolumeSpecName "kube-api-access-9jm94". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.831032 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-dns-swift-storage-0" (OuterVolumeSpecName: "dns-swift-storage-0") pod "a386293e-4f14-42af-824d-cbe47d3680b9" (UID: "a386293e-4f14-42af-824d-cbe47d3680b9"). InnerVolumeSpecName "dns-swift-storage-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.834341 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-config" (OuterVolumeSpecName: "config") pod "a386293e-4f14-42af-824d-cbe47d3680b9" (UID: "a386293e-4f14-42af-824d-cbe47d3680b9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.835022 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-openstack-edpm-ipam" (OuterVolumeSpecName: "openstack-edpm-ipam") pod "a386293e-4f14-42af-824d-cbe47d3680b9" (UID: "a386293e-4f14-42af-824d-cbe47d3680b9"). InnerVolumeSpecName "openstack-edpm-ipam". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.836766 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-ovsdbserver-nb" (OuterVolumeSpecName: "ovsdbserver-nb") pod "a386293e-4f14-42af-824d-cbe47d3680b9" (UID: "a386293e-4f14-42af-824d-cbe47d3680b9"). InnerVolumeSpecName "ovsdbserver-nb". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.841812 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-dns-svc" (OuterVolumeSpecName: "dns-svc") pod "a386293e-4f14-42af-824d-cbe47d3680b9" (UID: "a386293e-4f14-42af-824d-cbe47d3680b9"). InnerVolumeSpecName "dns-svc". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.853272 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-ovsdbserver-sb" (OuterVolumeSpecName: "ovsdbserver-sb") pod "a386293e-4f14-42af-824d-cbe47d3680b9" (UID: "a386293e-4f14-42af-824d-cbe47d3680b9"). InnerVolumeSpecName "ovsdbserver-sb". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.882280 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9jm94\" (UniqueName: \"kubernetes.io/projected/a386293e-4f14-42af-824d-cbe47d3680b9-kube-api-access-9jm94\") on node \"crc\" DevicePath \"\"" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.882364 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-nb\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-ovsdbserver-nb\") on node \"crc\" DevicePath \"\"" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.882380 4765 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-config\") on node \"crc\" DevicePath \"\"" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.882394 4765 reconciler_common.go:293] "Volume detached for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-dns-svc\") on node \"crc\" DevicePath \"\"" Mar 19 10:47:04 crc 
kubenswrapper[4765]: I0319 10:47:04.882406 4765 reconciler_common.go:293] "Volume detached for volume \"dns-swift-storage-0\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-dns-swift-storage-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.882419 4765 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:47:04 crc kubenswrapper[4765]: I0319 10:47:04.882431 4765 reconciler_common.go:293] "Volume detached for volume \"ovsdbserver-sb\" (UniqueName: \"kubernetes.io/configmap/a386293e-4f14-42af-824d-cbe47d3680b9-ovsdbserver-sb\") on node \"crc\" DevicePath \"\"" Mar 19 10:47:05 crc kubenswrapper[4765]: I0319 10:47:05.038947 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-r5tvs"] Mar 19 10:47:05 crc kubenswrapper[4765]: I0319 10:47:05.047448 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-d558885bc-r5tvs"] Mar 19 10:47:05 crc kubenswrapper[4765]: E0319 10:47:05.358793 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2190a046_0d52_49c7_b2fd_aa113c2f3f99.slice/crio-df30f2ef76337c66293524a48dc1dc49221fdfb46718508dbd906c418420cf0d\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2190a046_0d52_49c7_b2fd_aa113c2f3f99.slice\": RecentStats: unable to find data in memory cache]" Mar 19 10:47:06 crc kubenswrapper[4765]: I0319 10:47:06.366644 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a386293e-4f14-42af-824d-cbe47d3680b9" path="/var/lib/kubelet/pods/a386293e-4f14-42af-824d-cbe47d3680b9/volumes" Mar 19 10:47:12 crc kubenswrapper[4765]: I0319 
10:47:12.956141 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8"] Mar 19 10:47:12 crc kubenswrapper[4765]: E0319 10:47:12.958421 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949ced49-5178-4806-a521-3b46431783ba" containerName="dnsmasq-dns" Mar 19 10:47:12 crc kubenswrapper[4765]: I0319 10:47:12.958575 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="949ced49-5178-4806-a521-3b46431783ba" containerName="dnsmasq-dns" Mar 19 10:47:12 crc kubenswrapper[4765]: E0319 10:47:12.965321 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a386293e-4f14-42af-824d-cbe47d3680b9" containerName="dnsmasq-dns" Mar 19 10:47:12 crc kubenswrapper[4765]: I0319 10:47:12.965363 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a386293e-4f14-42af-824d-cbe47d3680b9" containerName="dnsmasq-dns" Mar 19 10:47:12 crc kubenswrapper[4765]: E0319 10:47:12.965377 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a386293e-4f14-42af-824d-cbe47d3680b9" containerName="init" Mar 19 10:47:12 crc kubenswrapper[4765]: I0319 10:47:12.965385 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a386293e-4f14-42af-824d-cbe47d3680b9" containerName="init" Mar 19 10:47:12 crc kubenswrapper[4765]: E0319 10:47:12.965406 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="949ced49-5178-4806-a521-3b46431783ba" containerName="init" Mar 19 10:47:12 crc kubenswrapper[4765]: I0319 10:47:12.965415 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="949ced49-5178-4806-a521-3b46431783ba" containerName="init" Mar 19 10:47:12 crc kubenswrapper[4765]: I0319 10:47:12.965788 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="a386293e-4f14-42af-824d-cbe47d3680b9" containerName="dnsmasq-dns" Mar 19 10:47:12 crc kubenswrapper[4765]: I0319 10:47:12.965820 4765 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="949ced49-5178-4806-a521-3b46431783ba" containerName="dnsmasq-dns" Mar 19 10:47:12 crc kubenswrapper[4765]: I0319 10:47:12.966492 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:12 crc kubenswrapper[4765]: I0319 10:47:12.969549 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8"] Mar 19 10:47:12 crc kubenswrapper[4765]: I0319 10:47:12.969560 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:47:12 crc kubenswrapper[4765]: I0319 10:47:12.969576 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:47:12 crc kubenswrapper[4765]: I0319 10:47:12.969943 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:47:12 crc kubenswrapper[4765]: I0319 10:47:12.970077 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c" Mar 19 10:47:13 crc kubenswrapper[4765]: I0319 10:47:13.038169 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cp2t5\" (UniqueName: \"kubernetes.io/projected/ab6c9186-ce11-4085-9c4c-c0964cb170d8-kube-api-access-cp2t5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:13 crc kubenswrapper[4765]: I0319 10:47:13.038424 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-repo-setup-combined-ca-bundle\") pod 
\"repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:13 crc kubenswrapper[4765]: I0319 10:47:13.038659 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:13 crc kubenswrapper[4765]: I0319 10:47:13.038845 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:13 crc kubenswrapper[4765]: I0319 10:47:13.141335 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:13 crc kubenswrapper[4765]: I0319 10:47:13.141444 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:13 
crc kubenswrapper[4765]: I0319 10:47:13.141521 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cp2t5\" (UniqueName: \"kubernetes.io/projected/ab6c9186-ce11-4085-9c4c-c0964cb170d8-kube-api-access-cp2t5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:13 crc kubenswrapper[4765]: I0319 10:47:13.141550 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:13 crc kubenswrapper[4765]: I0319 10:47:13.149839 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-ssh-key-openstack-edpm-ipam\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:13 crc kubenswrapper[4765]: I0319 10:47:13.150071 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-repo-setup-combined-ca-bundle\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:13 crc kubenswrapper[4765]: I0319 10:47:13.150380 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-inventory\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:13 crc kubenswrapper[4765]: I0319 10:47:13.162108 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cp2t5\" (UniqueName: \"kubernetes.io/projected/ab6c9186-ce11-4085-9c4c-c0964cb170d8-kube-api-access-cp2t5\") pod \"repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:13 crc kubenswrapper[4765]: I0319 10:47:13.287674 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:13 crc kubenswrapper[4765]: I0319 10:47:13.829450 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8"] Mar 19 10:47:13 crc kubenswrapper[4765]: W0319 10:47:13.835061 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podab6c9186_ce11_4085_9c4c_c0964cb170d8.slice/crio-5e291073ac3521159f24f9fb0a803e747d3b972f06bc25218cee8b5e9689cc98 WatchSource:0}: Error finding container 5e291073ac3521159f24f9fb0a803e747d3b972f06bc25218cee8b5e9689cc98: Status 404 returned error can't find the container with id 5e291073ac3521159f24f9fb0a803e747d3b972f06bc25218cee8b5e9689cc98 Mar 19 10:47:14 crc kubenswrapper[4765]: I0319 10:47:14.788915 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" event={"ID":"ab6c9186-ce11-4085-9c4c-c0964cb170d8","Type":"ContainerStarted","Data":"5e291073ac3521159f24f9fb0a803e747d3b972f06bc25218cee8b5e9689cc98"} Mar 19 10:47:15 crc 
kubenswrapper[4765]: E0319 10:47:15.602036 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2190a046_0d52_49c7_b2fd_aa113c2f3f99.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2190a046_0d52_49c7_b2fd_aa113c2f3f99.slice/crio-df30f2ef76337c66293524a48dc1dc49221fdfb46718508dbd906c418420cf0d\": RecentStats: unable to find data in memory cache]" Mar 19 10:47:17 crc kubenswrapper[4765]: I0319 10:47:17.818894 4765 generic.go:334] "Generic (PLEG): container finished" podID="53406a09-7bdd-4517-ac01-0823bce386bc" containerID="4e0cfb1cf40da954d5de612f7bc6f4ab3700e0aaa0924cbeb7e16a3b91046b3a" exitCode=0 Mar 19 10:47:17 crc kubenswrapper[4765]: I0319 10:47:17.818916 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53406a09-7bdd-4517-ac01-0823bce386bc","Type":"ContainerDied","Data":"4e0cfb1cf40da954d5de612f7bc6f4ab3700e0aaa0924cbeb7e16a3b91046b3a"} Mar 19 10:47:18 crc kubenswrapper[4765]: I0319 10:47:18.830506 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-server-0" event={"ID":"53406a09-7bdd-4517-ac01-0823bce386bc","Type":"ContainerStarted","Data":"7c90a544dd7e15a561bab8befc4462aa058baf648e32758677d9ed076452f49e"} Mar 19 10:47:18 crc kubenswrapper[4765]: I0319 10:47:18.831651 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-server-0" Mar 19 10:47:18 crc kubenswrapper[4765]: I0319 10:47:18.832719 4765 generic.go:334] "Generic (PLEG): container finished" podID="c3043d68-f6dc-4095-bc0e-62b2282dd297" containerID="a70129a543ce35722d209e43cec6e98b05648ad0f47d4281a61ffc2aa1a2ad27" exitCode=0 Mar 19 10:47:18 crc kubenswrapper[4765]: I0319 10:47:18.832755 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" 
event={"ID":"c3043d68-f6dc-4095-bc0e-62b2282dd297","Type":"ContainerDied","Data":"a70129a543ce35722d209e43cec6e98b05648ad0f47d4281a61ffc2aa1a2ad27"} Mar 19 10:47:18 crc kubenswrapper[4765]: I0319 10:47:18.860182 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-server-0" podStartSLOduration=36.86016788 podStartE2EDuration="36.86016788s" podCreationTimestamp="2026-03-19 10:46:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:47:18.854820764 +0000 UTC m=+1537.203766306" watchObservedRunningTime="2026-03-19 10:47:18.86016788 +0000 UTC m=+1537.209113422" Mar 19 10:47:24 crc kubenswrapper[4765]: I0319 10:47:24.892723 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" event={"ID":"ab6c9186-ce11-4085-9c4c-c0964cb170d8","Type":"ContainerStarted","Data":"d9b3b542c2d9df22e6dce455d21944ab1ac942b61e2d3e00a3ec932749b5734d"} Mar 19 10:47:24 crc kubenswrapper[4765]: I0319 10:47:24.895625 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/rabbitmq-cell1-server-0" event={"ID":"c3043d68-f6dc-4095-bc0e-62b2282dd297","Type":"ContainerStarted","Data":"26cb6af006a7495700144d59df15ab87dda7f4b1bc1cf685e7004cd7b65e6417"} Mar 19 10:47:24 crc kubenswrapper[4765]: I0319 10:47:24.895847 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:47:24 crc kubenswrapper[4765]: I0319 10:47:24.916638 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" podStartSLOduration=2.447442488 podStartE2EDuration="12.916620848s" podCreationTimestamp="2026-03-19 10:47:12 +0000 UTC" firstStartedPulling="2026-03-19 10:47:13.839165593 +0000 UTC m=+1532.188111135" lastFinishedPulling="2026-03-19 10:47:24.308343953 +0000 UTC 
m=+1542.657289495" observedRunningTime="2026-03-19 10:47:24.914421548 +0000 UTC m=+1543.263367090" watchObservedRunningTime="2026-03-19 10:47:24.916620848 +0000 UTC m=+1543.265566390" Mar 19 10:47:24 crc kubenswrapper[4765]: I0319 10:47:24.943640 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/rabbitmq-cell1-server-0" podStartSLOduration=41.943612154 podStartE2EDuration="41.943612154s" podCreationTimestamp="2026-03-19 10:46:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 10:47:24.932213993 +0000 UTC m=+1543.281159535" watchObservedRunningTime="2026-03-19 10:47:24.943612154 +0000 UTC m=+1543.292557696" Mar 19 10:47:25 crc kubenswrapper[4765]: E0319 10:47:25.842872 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2190a046_0d52_49c7_b2fd_aa113c2f3f99.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2190a046_0d52_49c7_b2fd_aa113c2f3f99.slice/crio-df30f2ef76337c66293524a48dc1dc49221fdfb46718508dbd906c418420cf0d\": RecentStats: unable to find data in memory cache]" Mar 19 10:47:32 crc kubenswrapper[4765]: I0319 10:47:32.878167 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-server-0" Mar 19 10:47:34 crc kubenswrapper[4765]: I0319 10:47:34.129165 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/rabbitmq-cell1-server-0" Mar 19 10:47:35 crc kubenswrapper[4765]: I0319 10:47:35.994436 4765 generic.go:334] "Generic (PLEG): container finished" podID="ab6c9186-ce11-4085-9c4c-c0964cb170d8" containerID="d9b3b542c2d9df22e6dce455d21944ab1ac942b61e2d3e00a3ec932749b5734d" exitCode=0 Mar 19 10:47:35 crc kubenswrapper[4765]: I0319 10:47:35.994524 4765 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" event={"ID":"ab6c9186-ce11-4085-9c4c-c0964cb170d8","Type":"ContainerDied","Data":"d9b3b542c2d9df22e6dce455d21944ab1ac942b61e2d3e00a3ec932749b5734d"} Mar 19 10:47:36 crc kubenswrapper[4765]: E0319 10:47:36.098999 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2190a046_0d52_49c7_b2fd_aa113c2f3f99.slice\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2190a046_0d52_49c7_b2fd_aa113c2f3f99.slice/crio-df30f2ef76337c66293524a48dc1dc49221fdfb46718508dbd906c418420cf0d\": RecentStats: unable to find data in memory cache]" Mar 19 10:47:37 crc kubenswrapper[4765]: I0319 10:47:37.455286 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:37 crc kubenswrapper[4765]: I0319 10:47:37.562677 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-ssh-key-openstack-edpm-ipam\") pod \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " Mar 19 10:47:37 crc kubenswrapper[4765]: I0319 10:47:37.562826 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-inventory\") pod \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " Mar 19 10:47:37 crc kubenswrapper[4765]: I0319 10:47:37.562859 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-repo-setup-combined-ca-bundle\") pod \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " Mar 19 10:47:37 crc kubenswrapper[4765]: I0319 10:47:37.563020 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cp2t5\" (UniqueName: \"kubernetes.io/projected/ab6c9186-ce11-4085-9c4c-c0964cb170d8-kube-api-access-cp2t5\") pod \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\" (UID: \"ab6c9186-ce11-4085-9c4c-c0964cb170d8\") " Mar 19 10:47:37 crc kubenswrapper[4765]: I0319 10:47:37.568795 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "ab6c9186-ce11-4085-9c4c-c0964cb170d8" (UID: "ab6c9186-ce11-4085-9c4c-c0964cb170d8"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:47:37 crc kubenswrapper[4765]: I0319 10:47:37.570910 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab6c9186-ce11-4085-9c4c-c0964cb170d8-kube-api-access-cp2t5" (OuterVolumeSpecName: "kube-api-access-cp2t5") pod "ab6c9186-ce11-4085-9c4c-c0964cb170d8" (UID: "ab6c9186-ce11-4085-9c4c-c0964cb170d8"). InnerVolumeSpecName "kube-api-access-cp2t5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:47:37 crc kubenswrapper[4765]: I0319 10:47:37.596326 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ab6c9186-ce11-4085-9c4c-c0964cb170d8" (UID: "ab6c9186-ce11-4085-9c4c-c0964cb170d8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:47:37 crc kubenswrapper[4765]: I0319 10:47:37.597539 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-inventory" (OuterVolumeSpecName: "inventory") pod "ab6c9186-ce11-4085-9c4c-c0964cb170d8" (UID: "ab6c9186-ce11-4085-9c4c-c0964cb170d8"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:47:37 crc kubenswrapper[4765]: I0319 10:47:37.669113 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:47:37 crc kubenswrapper[4765]: I0319 10:47:37.669190 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:47:37 crc kubenswrapper[4765]: I0319 10:47:37.669205 4765 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ab6c9186-ce11-4085-9c4c-c0964cb170d8-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:47:37 crc kubenswrapper[4765]: I0319 10:47:37.669224 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cp2t5\" (UniqueName: \"kubernetes.io/projected/ab6c9186-ce11-4085-9c4c-c0964cb170d8-kube-api-access-cp2t5\") on node \"crc\" DevicePath \"\"" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.010908 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" event={"ID":"ab6c9186-ce11-4085-9c4c-c0964cb170d8","Type":"ContainerDied","Data":"5e291073ac3521159f24f9fb0a803e747d3b972f06bc25218cee8b5e9689cc98"} Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.011318 
4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e291073ac3521159f24f9fb0a803e747d3b972f06bc25218cee8b5e9689cc98" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.010993 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.095579 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz"] Mar 19 10:47:38 crc kubenswrapper[4765]: E0319 10:47:38.096056 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ab6c9186-ce11-4085-9c4c-c0964cb170d8" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.096080 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab6c9186-ce11-4085-9c4c-c0964cb170d8" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.096327 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ab6c9186-ce11-4085-9c4c-c0964cb170d8" containerName="repo-setup-edpm-deployment-openstack-edpm-ipam" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.097112 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.101010 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.101277 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.101423 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.103311 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.111577 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz"] Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.278491 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af5d1fcd-a500-4d64-a86a-37cae82350d3-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zsxdz\" (UID: \"af5d1fcd-a500-4d64-a86a-37cae82350d3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.278620 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af5d1fcd-a500-4d64-a86a-37cae82350d3-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zsxdz\" (UID: \"af5d1fcd-a500-4d64-a86a-37cae82350d3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.279059 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfvdz\" (UniqueName: \"kubernetes.io/projected/af5d1fcd-a500-4d64-a86a-37cae82350d3-kube-api-access-mfvdz\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zsxdz\" (UID: \"af5d1fcd-a500-4d64-a86a-37cae82350d3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.380865 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mfvdz\" (UniqueName: \"kubernetes.io/projected/af5d1fcd-a500-4d64-a86a-37cae82350d3-kube-api-access-mfvdz\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zsxdz\" (UID: \"af5d1fcd-a500-4d64-a86a-37cae82350d3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.380917 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af5d1fcd-a500-4d64-a86a-37cae82350d3-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zsxdz\" (UID: \"af5d1fcd-a500-4d64-a86a-37cae82350d3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.381006 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af5d1fcd-a500-4d64-a86a-37cae82350d3-ssh-key-openstack-edpm-ipam\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zsxdz\" (UID: \"af5d1fcd-a500-4d64-a86a-37cae82350d3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.386016 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af5d1fcd-a500-4d64-a86a-37cae82350d3-ssh-key-openstack-edpm-ipam\") pod 
\"redhat-edpm-deployment-openstack-edpm-ipam-zsxdz\" (UID: \"af5d1fcd-a500-4d64-a86a-37cae82350d3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.386531 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af5d1fcd-a500-4d64-a86a-37cae82350d3-inventory\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zsxdz\" (UID: \"af5d1fcd-a500-4d64-a86a-37cae82350d3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.396804 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfvdz\" (UniqueName: \"kubernetes.io/projected/af5d1fcd-a500-4d64-a86a-37cae82350d3-kube-api-access-mfvdz\") pod \"redhat-edpm-deployment-openstack-edpm-ipam-zsxdz\" (UID: \"af5d1fcd-a500-4d64-a86a-37cae82350d3\") " pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.416103 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.941905 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz"] Mar 19 10:47:38 crc kubenswrapper[4765]: I0319 10:47:38.946006 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 10:47:39 crc kubenswrapper[4765]: I0319 10:47:39.021005 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" event={"ID":"af5d1fcd-a500-4d64-a86a-37cae82350d3","Type":"ContainerStarted","Data":"82a525a15a6a803b1280fa2eecba44afc51946f770d955ef8a5afb714fc184aa"} Mar 19 10:47:40 crc kubenswrapper[4765]: I0319 10:47:40.032740 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" event={"ID":"af5d1fcd-a500-4d64-a86a-37cae82350d3","Type":"ContainerStarted","Data":"9fdeb6c86ed909af47bd0d344f69de9da9d43934bfd8e3ef7866eb301d2e5399"} Mar 19 10:47:40 crc kubenswrapper[4765]: I0319 10:47:40.051245 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" podStartSLOduration=1.596530048 podStartE2EDuration="2.051226145s" podCreationTimestamp="2026-03-19 10:47:38 +0000 UTC" firstStartedPulling="2026-03-19 10:47:38.945756906 +0000 UTC m=+1557.294702448" lastFinishedPulling="2026-03-19 10:47:39.400453003 +0000 UTC m=+1557.749398545" observedRunningTime="2026-03-19 10:47:40.048871251 +0000 UTC m=+1558.397816793" watchObservedRunningTime="2026-03-19 10:47:40.051226145 +0000 UTC m=+1558.400171687" Mar 19 10:47:42 crc kubenswrapper[4765]: I0319 10:47:42.052437 4765 generic.go:334] "Generic (PLEG): container finished" podID="af5d1fcd-a500-4d64-a86a-37cae82350d3" containerID="9fdeb6c86ed909af47bd0d344f69de9da9d43934bfd8e3ef7866eb301d2e5399" 
exitCode=0 Mar 19 10:47:42 crc kubenswrapper[4765]: I0319 10:47:42.052529 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" event={"ID":"af5d1fcd-a500-4d64-a86a-37cae82350d3","Type":"ContainerDied","Data":"9fdeb6c86ed909af47bd0d344f69de9da9d43934bfd8e3ef7866eb301d2e5399"} Mar 19 10:47:43 crc kubenswrapper[4765]: I0319 10:47:43.477110 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" Mar 19 10:47:43 crc kubenswrapper[4765]: I0319 10:47:43.584449 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af5d1fcd-a500-4d64-a86a-37cae82350d3-inventory\") pod \"af5d1fcd-a500-4d64-a86a-37cae82350d3\" (UID: \"af5d1fcd-a500-4d64-a86a-37cae82350d3\") " Mar 19 10:47:43 crc kubenswrapper[4765]: I0319 10:47:43.584580 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mfvdz\" (UniqueName: \"kubernetes.io/projected/af5d1fcd-a500-4d64-a86a-37cae82350d3-kube-api-access-mfvdz\") pod \"af5d1fcd-a500-4d64-a86a-37cae82350d3\" (UID: \"af5d1fcd-a500-4d64-a86a-37cae82350d3\") " Mar 19 10:47:43 crc kubenswrapper[4765]: I0319 10:47:43.584605 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af5d1fcd-a500-4d64-a86a-37cae82350d3-ssh-key-openstack-edpm-ipam\") pod \"af5d1fcd-a500-4d64-a86a-37cae82350d3\" (UID: \"af5d1fcd-a500-4d64-a86a-37cae82350d3\") " Mar 19 10:47:43 crc kubenswrapper[4765]: I0319 10:47:43.592561 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af5d1fcd-a500-4d64-a86a-37cae82350d3-kube-api-access-mfvdz" (OuterVolumeSpecName: "kube-api-access-mfvdz") pod "af5d1fcd-a500-4d64-a86a-37cae82350d3" (UID: 
"af5d1fcd-a500-4d64-a86a-37cae82350d3"). InnerVolumeSpecName "kube-api-access-mfvdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:47:43 crc kubenswrapper[4765]: I0319 10:47:43.615327 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af5d1fcd-a500-4d64-a86a-37cae82350d3-inventory" (OuterVolumeSpecName: "inventory") pod "af5d1fcd-a500-4d64-a86a-37cae82350d3" (UID: "af5d1fcd-a500-4d64-a86a-37cae82350d3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:47:43 crc kubenswrapper[4765]: I0319 10:47:43.617934 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/af5d1fcd-a500-4d64-a86a-37cae82350d3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "af5d1fcd-a500-4d64-a86a-37cae82350d3" (UID: "af5d1fcd-a500-4d64-a86a-37cae82350d3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:47:43 crc kubenswrapper[4765]: I0319 10:47:43.687453 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/af5d1fcd-a500-4d64-a86a-37cae82350d3-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:47:43 crc kubenswrapper[4765]: I0319 10:47:43.687493 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mfvdz\" (UniqueName: \"kubernetes.io/projected/af5d1fcd-a500-4d64-a86a-37cae82350d3-kube-api-access-mfvdz\") on node \"crc\" DevicePath \"\"" Mar 19 10:47:43 crc kubenswrapper[4765]: I0319 10:47:43.687508 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/af5d1fcd-a500-4d64-a86a-37cae82350d3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.068877 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" event={"ID":"af5d1fcd-a500-4d64-a86a-37cae82350d3","Type":"ContainerDied","Data":"82a525a15a6a803b1280fa2eecba44afc51946f770d955ef8a5afb714fc184aa"} Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.069099 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82a525a15a6a803b1280fa2eecba44afc51946f770d955ef8a5afb714fc184aa" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.068910 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/redhat-edpm-deployment-openstack-edpm-ipam-zsxdz" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.131762 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm"] Mar 19 10:47:44 crc kubenswrapper[4765]: E0319 10:47:44.132138 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af5d1fcd-a500-4d64-a86a-37cae82350d3" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.132156 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="af5d1fcd-a500-4d64-a86a-37cae82350d3" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.132361 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="af5d1fcd-a500-4d64-a86a-37cae82350d3" containerName="redhat-edpm-deployment-openstack-edpm-ipam" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.133027 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.136693 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.143827 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.144135 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.145542 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.149052 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm"] Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.301443 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.301617 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.301765 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.301850 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-brsd2\" (UniqueName: \"kubernetes.io/projected/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-kube-api-access-brsd2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.403870 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.404033 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.404123 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.404180 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-brsd2\" (UniqueName: \"kubernetes.io/projected/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-kube-api-access-brsd2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.407817 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-ssh-key-openstack-edpm-ipam\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.408040 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-inventory\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.409315 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-bootstrap-combined-ca-bundle\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " 
pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.420450 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-brsd2\" (UniqueName: \"kubernetes.io/projected/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-kube-api-access-brsd2\") pod \"bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.452806 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:47:44 crc kubenswrapper[4765]: W0319 10:47:44.947660 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0050f5ac_5380_49b0_98ad_fdd7c3b94f51.slice/crio-b28e2d0af306dfdb7546ddf8a39da22b31fb7c2cb9c20294c917bbaddcdf4bbe WatchSource:0}: Error finding container b28e2d0af306dfdb7546ddf8a39da22b31fb7c2cb9c20294c917bbaddcdf4bbe: Status 404 returned error can't find the container with id b28e2d0af306dfdb7546ddf8a39da22b31fb7c2cb9c20294c917bbaddcdf4bbe Mar 19 10:47:44 crc kubenswrapper[4765]: I0319 10:47:44.947551 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm"] Mar 19 10:47:45 crc kubenswrapper[4765]: I0319 10:47:45.079741 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" event={"ID":"0050f5ac-5380-49b0-98ad-fdd7c3b94f51","Type":"ContainerStarted","Data":"b28e2d0af306dfdb7546ddf8a39da22b31fb7c2cb9c20294c917bbaddcdf4bbe"} Mar 19 10:47:46 crc kubenswrapper[4765]: I0319 10:47:46.090332 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" 
event={"ID":"0050f5ac-5380-49b0-98ad-fdd7c3b94f51","Type":"ContainerStarted","Data":"cc48488fe36e4a6a946465c90997f849f8c3bc2150800a8e896cc284b89df7f3"} Mar 19 10:47:46 crc kubenswrapper[4765]: I0319 10:47:46.112579 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" podStartSLOduration=1.516395451 podStartE2EDuration="2.112558241s" podCreationTimestamp="2026-03-19 10:47:44 +0000 UTC" firstStartedPulling="2026-03-19 10:47:44.951561782 +0000 UTC m=+1563.300507324" lastFinishedPulling="2026-03-19 10:47:45.547724562 +0000 UTC m=+1563.896670114" observedRunningTime="2026-03-19 10:47:46.105850929 +0000 UTC m=+1564.454796481" watchObservedRunningTime="2026-03-19 10:47:46.112558241 +0000 UTC m=+1564.461503783" Mar 19 10:47:57 crc kubenswrapper[4765]: I0319 10:47:57.810776 4765 scope.go:117] "RemoveContainer" containerID="34a946f35cc460e5df13d903d4f2c1bd42982ba98394d7334cb6c9b9790ff5e2" Mar 19 10:47:57 crc kubenswrapper[4765]: I0319 10:47:57.915373 4765 scope.go:117] "RemoveContainer" containerID="405a6fd882b389058788691b6724f0ceecc9e069f7f80c7163dd4f20685d4d51" Mar 19 10:47:57 crc kubenswrapper[4765]: I0319 10:47:57.967207 4765 scope.go:117] "RemoveContainer" containerID="919991393c196c5abb9575f97ee0463433c55869ae3b1a020861c0aab1a44f58" Mar 19 10:48:00 crc kubenswrapper[4765]: I0319 10:48:00.129876 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565288-bmzpl"] Mar 19 10:48:00 crc kubenswrapper[4765]: I0319 10:48:00.131723 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565288-bmzpl" Mar 19 10:48:00 crc kubenswrapper[4765]: I0319 10:48:00.135034 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:48:00 crc kubenswrapper[4765]: I0319 10:48:00.135104 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:48:00 crc kubenswrapper[4765]: I0319 10:48:00.136382 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:48:00 crc kubenswrapper[4765]: I0319 10:48:00.149683 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565288-bmzpl"] Mar 19 10:48:00 crc kubenswrapper[4765]: I0319 10:48:00.216521 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4dnb\" (UniqueName: \"kubernetes.io/projected/2cbc4a7d-5dd1-4318-95eb-4e2047bfff19-kube-api-access-f4dnb\") pod \"auto-csr-approver-29565288-bmzpl\" (UID: \"2cbc4a7d-5dd1-4318-95eb-4e2047bfff19\") " pod="openshift-infra/auto-csr-approver-29565288-bmzpl" Mar 19 10:48:00 crc kubenswrapper[4765]: I0319 10:48:00.319034 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4dnb\" (UniqueName: \"kubernetes.io/projected/2cbc4a7d-5dd1-4318-95eb-4e2047bfff19-kube-api-access-f4dnb\") pod \"auto-csr-approver-29565288-bmzpl\" (UID: \"2cbc4a7d-5dd1-4318-95eb-4e2047bfff19\") " pod="openshift-infra/auto-csr-approver-29565288-bmzpl" Mar 19 10:48:00 crc kubenswrapper[4765]: I0319 10:48:00.346928 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4dnb\" (UniqueName: \"kubernetes.io/projected/2cbc4a7d-5dd1-4318-95eb-4e2047bfff19-kube-api-access-f4dnb\") pod \"auto-csr-approver-29565288-bmzpl\" (UID: \"2cbc4a7d-5dd1-4318-95eb-4e2047bfff19\") " 
pod="openshift-infra/auto-csr-approver-29565288-bmzpl" Mar 19 10:48:00 crc kubenswrapper[4765]: I0319 10:48:00.459509 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565288-bmzpl" Mar 19 10:48:00 crc kubenswrapper[4765]: I0319 10:48:00.890116 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565288-bmzpl"] Mar 19 10:48:01 crc kubenswrapper[4765]: I0319 10:48:01.239609 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565288-bmzpl" event={"ID":"2cbc4a7d-5dd1-4318-95eb-4e2047bfff19","Type":"ContainerStarted","Data":"d71e4e4330f4c54d139e77eef6d23b1169d8ee23756aa30d4161cab2c0274b41"} Mar 19 10:48:02 crc kubenswrapper[4765]: I0319 10:48:02.268582 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565288-bmzpl" event={"ID":"2cbc4a7d-5dd1-4318-95eb-4e2047bfff19","Type":"ContainerStarted","Data":"f7d4a6c1acc5018cdd82df547db6d4304cc5a336afb590dbb8e98e36da2cb4f7"} Mar 19 10:48:02 crc kubenswrapper[4765]: I0319 10:48:02.292529 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565288-bmzpl" podStartSLOduration=1.298333996 podStartE2EDuration="2.292505034s" podCreationTimestamp="2026-03-19 10:48:00 +0000 UTC" firstStartedPulling="2026-03-19 10:48:00.900294206 +0000 UTC m=+1579.249239758" lastFinishedPulling="2026-03-19 10:48:01.894465254 +0000 UTC m=+1580.243410796" observedRunningTime="2026-03-19 10:48:02.288392182 +0000 UTC m=+1580.637337724" watchObservedRunningTime="2026-03-19 10:48:02.292505034 +0000 UTC m=+1580.641450576" Mar 19 10:48:03 crc kubenswrapper[4765]: I0319 10:48:03.278560 4765 generic.go:334] "Generic (PLEG): container finished" podID="2cbc4a7d-5dd1-4318-95eb-4e2047bfff19" containerID="f7d4a6c1acc5018cdd82df547db6d4304cc5a336afb590dbb8e98e36da2cb4f7" exitCode=0 Mar 19 10:48:03 crc 
kubenswrapper[4765]: I0319 10:48:03.278654 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565288-bmzpl" event={"ID":"2cbc4a7d-5dd1-4318-95eb-4e2047bfff19","Type":"ContainerDied","Data":"f7d4a6c1acc5018cdd82df547db6d4304cc5a336afb590dbb8e98e36da2cb4f7"} Mar 19 10:48:04 crc kubenswrapper[4765]: I0319 10:48:04.616112 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565288-bmzpl" Mar 19 10:48:04 crc kubenswrapper[4765]: I0319 10:48:04.707714 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4dnb\" (UniqueName: \"kubernetes.io/projected/2cbc4a7d-5dd1-4318-95eb-4e2047bfff19-kube-api-access-f4dnb\") pod \"2cbc4a7d-5dd1-4318-95eb-4e2047bfff19\" (UID: \"2cbc4a7d-5dd1-4318-95eb-4e2047bfff19\") " Mar 19 10:48:04 crc kubenswrapper[4765]: I0319 10:48:04.714085 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2cbc4a7d-5dd1-4318-95eb-4e2047bfff19-kube-api-access-f4dnb" (OuterVolumeSpecName: "kube-api-access-f4dnb") pod "2cbc4a7d-5dd1-4318-95eb-4e2047bfff19" (UID: "2cbc4a7d-5dd1-4318-95eb-4e2047bfff19"). InnerVolumeSpecName "kube-api-access-f4dnb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:48:04 crc kubenswrapper[4765]: I0319 10:48:04.810457 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4dnb\" (UniqueName: \"kubernetes.io/projected/2cbc4a7d-5dd1-4318-95eb-4e2047bfff19-kube-api-access-f4dnb\") on node \"crc\" DevicePath \"\"" Mar 19 10:48:05 crc kubenswrapper[4765]: I0319 10:48:05.337487 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565288-bmzpl" event={"ID":"2cbc4a7d-5dd1-4318-95eb-4e2047bfff19","Type":"ContainerDied","Data":"d71e4e4330f4c54d139e77eef6d23b1169d8ee23756aa30d4161cab2c0274b41"} Mar 19 10:48:05 crc kubenswrapper[4765]: I0319 10:48:05.337538 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d71e4e4330f4c54d139e77eef6d23b1169d8ee23756aa30d4161cab2c0274b41" Mar 19 10:48:05 crc kubenswrapper[4765]: I0319 10:48:05.337663 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565288-bmzpl" Mar 19 10:48:05 crc kubenswrapper[4765]: I0319 10:48:05.349855 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565282-gnhzm"] Mar 19 10:48:05 crc kubenswrapper[4765]: I0319 10:48:05.359228 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565282-gnhzm"] Mar 19 10:48:06 crc kubenswrapper[4765]: I0319 10:48:06.366709 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4ea0d631-9296-4da6-96c3-a94b068052ed" path="/var/lib/kubelet/pods/4ea0d631-9296-4da6-96c3-a94b068052ed/volumes" Mar 19 10:48:58 crc kubenswrapper[4765]: I0319 10:48:58.098798 4765 scope.go:117] "RemoveContainer" containerID="33fb7a0188d2230ab9609cf706f229cd7f7d88f877b33adcacd7e8705e07dfcb" Mar 19 10:48:58 crc kubenswrapper[4765]: I0319 10:48:58.157157 4765 scope.go:117] "RemoveContainer" 
containerID="72065889be02d6207cab7936615f505cfd50e1668f8f850ab2de7523c9998cc5" Mar 19 10:49:01 crc kubenswrapper[4765]: I0319 10:49:01.656402 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:49:01 crc kubenswrapper[4765]: I0319 10:49:01.657316 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:49:31 crc kubenswrapper[4765]: I0319 10:49:31.656137 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:49:31 crc kubenswrapper[4765]: I0319 10:49:31.656739 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:49:40 crc kubenswrapper[4765]: I0319 10:49:40.386874 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-s46xl"] Mar 19 10:49:40 crc kubenswrapper[4765]: E0319 10:49:40.387539 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2cbc4a7d-5dd1-4318-95eb-4e2047bfff19" containerName="oc" Mar 19 10:49:40 crc kubenswrapper[4765]: I0319 10:49:40.387552 
4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2cbc4a7d-5dd1-4318-95eb-4e2047bfff19" containerName="oc" Mar 19 10:49:40 crc kubenswrapper[4765]: I0319 10:49:40.387756 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="2cbc4a7d-5dd1-4318-95eb-4e2047bfff19" containerName="oc" Mar 19 10:49:40 crc kubenswrapper[4765]: I0319 10:49:40.389039 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:40 crc kubenswrapper[4765]: I0319 10:49:40.403973 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s46xl"] Mar 19 10:49:40 crc kubenswrapper[4765]: I0319 10:49:40.488429 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/576c0f10-0461-4cb9-8bfa-69403b491f4a-catalog-content\") pod \"redhat-marketplace-s46xl\" (UID: \"576c0f10-0461-4cb9-8bfa-69403b491f4a\") " pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:40 crc kubenswrapper[4765]: I0319 10:49:40.488497 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/576c0f10-0461-4cb9-8bfa-69403b491f4a-utilities\") pod \"redhat-marketplace-s46xl\" (UID: \"576c0f10-0461-4cb9-8bfa-69403b491f4a\") " pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:40 crc kubenswrapper[4765]: I0319 10:49:40.488583 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8q54x\" (UniqueName: \"kubernetes.io/projected/576c0f10-0461-4cb9-8bfa-69403b491f4a-kube-api-access-8q54x\") pod \"redhat-marketplace-s46xl\" (UID: \"576c0f10-0461-4cb9-8bfa-69403b491f4a\") " pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:40 crc kubenswrapper[4765]: I0319 10:49:40.590418 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/576c0f10-0461-4cb9-8bfa-69403b491f4a-catalog-content\") pod \"redhat-marketplace-s46xl\" (UID: \"576c0f10-0461-4cb9-8bfa-69403b491f4a\") " pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:40 crc kubenswrapper[4765]: I0319 10:49:40.590525 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/576c0f10-0461-4cb9-8bfa-69403b491f4a-utilities\") pod \"redhat-marketplace-s46xl\" (UID: \"576c0f10-0461-4cb9-8bfa-69403b491f4a\") " pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:40 crc kubenswrapper[4765]: I0319 10:49:40.590631 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8q54x\" (UniqueName: \"kubernetes.io/projected/576c0f10-0461-4cb9-8bfa-69403b491f4a-kube-api-access-8q54x\") pod \"redhat-marketplace-s46xl\" (UID: \"576c0f10-0461-4cb9-8bfa-69403b491f4a\") " pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:40 crc kubenswrapper[4765]: I0319 10:49:40.591258 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/576c0f10-0461-4cb9-8bfa-69403b491f4a-catalog-content\") pod \"redhat-marketplace-s46xl\" (UID: \"576c0f10-0461-4cb9-8bfa-69403b491f4a\") " pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:40 crc kubenswrapper[4765]: I0319 10:49:40.591510 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/576c0f10-0461-4cb9-8bfa-69403b491f4a-utilities\") pod \"redhat-marketplace-s46xl\" (UID: \"576c0f10-0461-4cb9-8bfa-69403b491f4a\") " pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:40 crc kubenswrapper[4765]: I0319 10:49:40.617635 4765 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-8q54x\" (UniqueName: \"kubernetes.io/projected/576c0f10-0461-4cb9-8bfa-69403b491f4a-kube-api-access-8q54x\") pod \"redhat-marketplace-s46xl\" (UID: \"576c0f10-0461-4cb9-8bfa-69403b491f4a\") " pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:40 crc kubenswrapper[4765]: I0319 10:49:40.718195 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:41 crc kubenswrapper[4765]: I0319 10:49:41.203402 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-s46xl"] Mar 19 10:49:42 crc kubenswrapper[4765]: I0319 10:49:42.196212 4765 generic.go:334] "Generic (PLEG): container finished" podID="576c0f10-0461-4cb9-8bfa-69403b491f4a" containerID="e1c3304ca76b0b3f63176149a58e37e2a67f9b66112c741b024b2b246b1ff226" exitCode=0 Mar 19 10:49:42 crc kubenswrapper[4765]: I0319 10:49:42.196258 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s46xl" event={"ID":"576c0f10-0461-4cb9-8bfa-69403b491f4a","Type":"ContainerDied","Data":"e1c3304ca76b0b3f63176149a58e37e2a67f9b66112c741b024b2b246b1ff226"} Mar 19 10:49:42 crc kubenswrapper[4765]: I0319 10:49:42.196496 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s46xl" event={"ID":"576c0f10-0461-4cb9-8bfa-69403b491f4a","Type":"ContainerStarted","Data":"02485adfc18181d1611140cca7011981be58efe8500d89302bf6c53fae09c5df"} Mar 19 10:49:43 crc kubenswrapper[4765]: I0319 10:49:43.207437 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s46xl" event={"ID":"576c0f10-0461-4cb9-8bfa-69403b491f4a","Type":"ContainerStarted","Data":"215a39d3ba894f446ca491cf62ed5ca73126de87497bc558ebcce9d2da897062"} Mar 19 10:49:44 crc kubenswrapper[4765]: I0319 10:49:44.216604 4765 generic.go:334] "Generic (PLEG): container finished" 
podID="576c0f10-0461-4cb9-8bfa-69403b491f4a" containerID="215a39d3ba894f446ca491cf62ed5ca73126de87497bc558ebcce9d2da897062" exitCode=0 Mar 19 10:49:44 crc kubenswrapper[4765]: I0319 10:49:44.216864 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s46xl" event={"ID":"576c0f10-0461-4cb9-8bfa-69403b491f4a","Type":"ContainerDied","Data":"215a39d3ba894f446ca491cf62ed5ca73126de87497bc558ebcce9d2da897062"} Mar 19 10:49:45 crc kubenswrapper[4765]: I0319 10:49:45.226206 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s46xl" event={"ID":"576c0f10-0461-4cb9-8bfa-69403b491f4a","Type":"ContainerStarted","Data":"09fb18342cc019977e44e6e0bd940c5c5b6a50661645af1571240343250c5502"} Mar 19 10:49:45 crc kubenswrapper[4765]: I0319 10:49:45.255754 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-s46xl" podStartSLOduration=2.551864967 podStartE2EDuration="5.255729644s" podCreationTimestamp="2026-03-19 10:49:40 +0000 UTC" firstStartedPulling="2026-03-19 10:49:42.19756889 +0000 UTC m=+1680.546514432" lastFinishedPulling="2026-03-19 10:49:44.901433557 +0000 UTC m=+1683.250379109" observedRunningTime="2026-03-19 10:49:45.24414909 +0000 UTC m=+1683.593094652" watchObservedRunningTime="2026-03-19 10:49:45.255729644 +0000 UTC m=+1683.604675186" Mar 19 10:49:50 crc kubenswrapper[4765]: I0319 10:49:50.718553 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:50 crc kubenswrapper[4765]: I0319 10:49:50.718886 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:50 crc kubenswrapper[4765]: I0319 10:49:50.772068 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:51 crc 
kubenswrapper[4765]: I0319 10:49:51.332658 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:51 crc kubenswrapper[4765]: I0319 10:49:51.387376 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s46xl"] Mar 19 10:49:53 crc kubenswrapper[4765]: I0319 10:49:53.307588 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-s46xl" podUID="576c0f10-0461-4cb9-8bfa-69403b491f4a" containerName="registry-server" containerID="cri-o://09fb18342cc019977e44e6e0bd940c5c5b6a50661645af1571240343250c5502" gracePeriod=2 Mar 19 10:49:53 crc kubenswrapper[4765]: I0319 10:49:53.776513 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:53 crc kubenswrapper[4765]: I0319 10:49:53.855710 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/576c0f10-0461-4cb9-8bfa-69403b491f4a-catalog-content\") pod \"576c0f10-0461-4cb9-8bfa-69403b491f4a\" (UID: \"576c0f10-0461-4cb9-8bfa-69403b491f4a\") " Mar 19 10:49:53 crc kubenswrapper[4765]: I0319 10:49:53.855798 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8q54x\" (UniqueName: \"kubernetes.io/projected/576c0f10-0461-4cb9-8bfa-69403b491f4a-kube-api-access-8q54x\") pod \"576c0f10-0461-4cb9-8bfa-69403b491f4a\" (UID: \"576c0f10-0461-4cb9-8bfa-69403b491f4a\") " Mar 19 10:49:53 crc kubenswrapper[4765]: I0319 10:49:53.856066 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/576c0f10-0461-4cb9-8bfa-69403b491f4a-utilities\") pod \"576c0f10-0461-4cb9-8bfa-69403b491f4a\" (UID: \"576c0f10-0461-4cb9-8bfa-69403b491f4a\") " Mar 19 10:49:53 crc 
kubenswrapper[4765]: I0319 10:49:53.856918 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/576c0f10-0461-4cb9-8bfa-69403b491f4a-utilities" (OuterVolumeSpecName: "utilities") pod "576c0f10-0461-4cb9-8bfa-69403b491f4a" (UID: "576c0f10-0461-4cb9-8bfa-69403b491f4a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:49:53 crc kubenswrapper[4765]: I0319 10:49:53.863159 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/576c0f10-0461-4cb9-8bfa-69403b491f4a-kube-api-access-8q54x" (OuterVolumeSpecName: "kube-api-access-8q54x") pod "576c0f10-0461-4cb9-8bfa-69403b491f4a" (UID: "576c0f10-0461-4cb9-8bfa-69403b491f4a"). InnerVolumeSpecName "kube-api-access-8q54x". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:49:53 crc kubenswrapper[4765]: I0319 10:49:53.882443 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/576c0f10-0461-4cb9-8bfa-69403b491f4a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "576c0f10-0461-4cb9-8bfa-69403b491f4a" (UID: "576c0f10-0461-4cb9-8bfa-69403b491f4a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:49:53 crc kubenswrapper[4765]: I0319 10:49:53.962497 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/576c0f10-0461-4cb9-8bfa-69403b491f4a-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:49:53 crc kubenswrapper[4765]: I0319 10:49:53.962537 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/576c0f10-0461-4cb9-8bfa-69403b491f4a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:49:53 crc kubenswrapper[4765]: I0319 10:49:53.962548 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8q54x\" (UniqueName: \"kubernetes.io/projected/576c0f10-0461-4cb9-8bfa-69403b491f4a-kube-api-access-8q54x\") on node \"crc\" DevicePath \"\"" Mar 19 10:49:54 crc kubenswrapper[4765]: I0319 10:49:54.317140 4765 generic.go:334] "Generic (PLEG): container finished" podID="576c0f10-0461-4cb9-8bfa-69403b491f4a" containerID="09fb18342cc019977e44e6e0bd940c5c5b6a50661645af1571240343250c5502" exitCode=0 Mar 19 10:49:54 crc kubenswrapper[4765]: I0319 10:49:54.317193 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-s46xl" Mar 19 10:49:54 crc kubenswrapper[4765]: I0319 10:49:54.317202 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s46xl" event={"ID":"576c0f10-0461-4cb9-8bfa-69403b491f4a","Type":"ContainerDied","Data":"09fb18342cc019977e44e6e0bd940c5c5b6a50661645af1571240343250c5502"} Mar 19 10:49:54 crc kubenswrapper[4765]: I0319 10:49:54.317254 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-s46xl" event={"ID":"576c0f10-0461-4cb9-8bfa-69403b491f4a","Type":"ContainerDied","Data":"02485adfc18181d1611140cca7011981be58efe8500d89302bf6c53fae09c5df"} Mar 19 10:49:54 crc kubenswrapper[4765]: I0319 10:49:54.317274 4765 scope.go:117] "RemoveContainer" containerID="09fb18342cc019977e44e6e0bd940c5c5b6a50661645af1571240343250c5502" Mar 19 10:49:54 crc kubenswrapper[4765]: I0319 10:49:54.335479 4765 scope.go:117] "RemoveContainer" containerID="215a39d3ba894f446ca491cf62ed5ca73126de87497bc558ebcce9d2da897062" Mar 19 10:49:54 crc kubenswrapper[4765]: I0319 10:49:54.370298 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-s46xl"] Mar 19 10:49:54 crc kubenswrapper[4765]: I0319 10:49:54.370341 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-s46xl"] Mar 19 10:49:54 crc kubenswrapper[4765]: I0319 10:49:54.375213 4765 scope.go:117] "RemoveContainer" containerID="e1c3304ca76b0b3f63176149a58e37e2a67f9b66112c741b024b2b246b1ff226" Mar 19 10:49:54 crc kubenswrapper[4765]: I0319 10:49:54.403634 4765 scope.go:117] "RemoveContainer" containerID="09fb18342cc019977e44e6e0bd940c5c5b6a50661645af1571240343250c5502" Mar 19 10:49:54 crc kubenswrapper[4765]: E0319 10:49:54.404075 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"09fb18342cc019977e44e6e0bd940c5c5b6a50661645af1571240343250c5502\": container with ID starting with 09fb18342cc019977e44e6e0bd940c5c5b6a50661645af1571240343250c5502 not found: ID does not exist" containerID="09fb18342cc019977e44e6e0bd940c5c5b6a50661645af1571240343250c5502" Mar 19 10:49:54 crc kubenswrapper[4765]: I0319 10:49:54.404117 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09fb18342cc019977e44e6e0bd940c5c5b6a50661645af1571240343250c5502"} err="failed to get container status \"09fb18342cc019977e44e6e0bd940c5c5b6a50661645af1571240343250c5502\": rpc error: code = NotFound desc = could not find container \"09fb18342cc019977e44e6e0bd940c5c5b6a50661645af1571240343250c5502\": container with ID starting with 09fb18342cc019977e44e6e0bd940c5c5b6a50661645af1571240343250c5502 not found: ID does not exist" Mar 19 10:49:54 crc kubenswrapper[4765]: I0319 10:49:54.404143 4765 scope.go:117] "RemoveContainer" containerID="215a39d3ba894f446ca491cf62ed5ca73126de87497bc558ebcce9d2da897062" Mar 19 10:49:54 crc kubenswrapper[4765]: E0319 10:49:54.404385 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"215a39d3ba894f446ca491cf62ed5ca73126de87497bc558ebcce9d2da897062\": container with ID starting with 215a39d3ba894f446ca491cf62ed5ca73126de87497bc558ebcce9d2da897062 not found: ID does not exist" containerID="215a39d3ba894f446ca491cf62ed5ca73126de87497bc558ebcce9d2da897062" Mar 19 10:49:54 crc kubenswrapper[4765]: I0319 10:49:54.404469 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"215a39d3ba894f446ca491cf62ed5ca73126de87497bc558ebcce9d2da897062"} err="failed to get container status \"215a39d3ba894f446ca491cf62ed5ca73126de87497bc558ebcce9d2da897062\": rpc error: code = NotFound desc = could not find container \"215a39d3ba894f446ca491cf62ed5ca73126de87497bc558ebcce9d2da897062\": container with ID 
starting with 215a39d3ba894f446ca491cf62ed5ca73126de87497bc558ebcce9d2da897062 not found: ID does not exist" Mar 19 10:49:54 crc kubenswrapper[4765]: I0319 10:49:54.404541 4765 scope.go:117] "RemoveContainer" containerID="e1c3304ca76b0b3f63176149a58e37e2a67f9b66112c741b024b2b246b1ff226" Mar 19 10:49:54 crc kubenswrapper[4765]: E0319 10:49:54.405034 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e1c3304ca76b0b3f63176149a58e37e2a67f9b66112c741b024b2b246b1ff226\": container with ID starting with e1c3304ca76b0b3f63176149a58e37e2a67f9b66112c741b024b2b246b1ff226 not found: ID does not exist" containerID="e1c3304ca76b0b3f63176149a58e37e2a67f9b66112c741b024b2b246b1ff226" Mar 19 10:49:54 crc kubenswrapper[4765]: I0319 10:49:54.405065 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e1c3304ca76b0b3f63176149a58e37e2a67f9b66112c741b024b2b246b1ff226"} err="failed to get container status \"e1c3304ca76b0b3f63176149a58e37e2a67f9b66112c741b024b2b246b1ff226\": rpc error: code = NotFound desc = could not find container \"e1c3304ca76b0b3f63176149a58e37e2a67f9b66112c741b024b2b246b1ff226\": container with ID starting with e1c3304ca76b0b3f63176149a58e37e2a67f9b66112c741b024b2b246b1ff226 not found: ID does not exist" Mar 19 10:49:56 crc kubenswrapper[4765]: I0319 10:49:56.369394 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="576c0f10-0461-4cb9-8bfa-69403b491f4a" path="/var/lib/kubelet/pods/576c0f10-0461-4cb9-8bfa-69403b491f4a/volumes" Mar 19 10:50:00 crc kubenswrapper[4765]: I0319 10:50:00.146146 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565290-zt5xt"] Mar 19 10:50:00 crc kubenswrapper[4765]: E0319 10:50:00.149092 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="576c0f10-0461-4cb9-8bfa-69403b491f4a" containerName="extract-content" Mar 19 10:50:00 crc 
kubenswrapper[4765]: I0319 10:50:00.149135 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="576c0f10-0461-4cb9-8bfa-69403b491f4a" containerName="extract-content" Mar 19 10:50:00 crc kubenswrapper[4765]: E0319 10:50:00.149176 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="576c0f10-0461-4cb9-8bfa-69403b491f4a" containerName="registry-server" Mar 19 10:50:00 crc kubenswrapper[4765]: I0319 10:50:00.149186 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="576c0f10-0461-4cb9-8bfa-69403b491f4a" containerName="registry-server" Mar 19 10:50:00 crc kubenswrapper[4765]: E0319 10:50:00.149208 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="576c0f10-0461-4cb9-8bfa-69403b491f4a" containerName="extract-utilities" Mar 19 10:50:00 crc kubenswrapper[4765]: I0319 10:50:00.149223 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="576c0f10-0461-4cb9-8bfa-69403b491f4a" containerName="extract-utilities" Mar 19 10:50:00 crc kubenswrapper[4765]: I0319 10:50:00.149587 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="576c0f10-0461-4cb9-8bfa-69403b491f4a" containerName="registry-server" Mar 19 10:50:00 crc kubenswrapper[4765]: I0319 10:50:00.150541 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565290-zt5xt" Mar 19 10:50:00 crc kubenswrapper[4765]: I0319 10:50:00.152850 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:50:00 crc kubenswrapper[4765]: I0319 10:50:00.153205 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:50:00 crc kubenswrapper[4765]: I0319 10:50:00.158760 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:50:00 crc kubenswrapper[4765]: I0319 10:50:00.160455 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565290-zt5xt"] Mar 19 10:50:00 crc kubenswrapper[4765]: I0319 10:50:00.195115 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x8dlh\" (UniqueName: \"kubernetes.io/projected/057f3e2f-f77b-4da3-a67b-4f0777602577-kube-api-access-x8dlh\") pod \"auto-csr-approver-29565290-zt5xt\" (UID: \"057f3e2f-f77b-4da3-a67b-4f0777602577\") " pod="openshift-infra/auto-csr-approver-29565290-zt5xt" Mar 19 10:50:00 crc kubenswrapper[4765]: I0319 10:50:00.297092 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x8dlh\" (UniqueName: \"kubernetes.io/projected/057f3e2f-f77b-4da3-a67b-4f0777602577-kube-api-access-x8dlh\") pod \"auto-csr-approver-29565290-zt5xt\" (UID: \"057f3e2f-f77b-4da3-a67b-4f0777602577\") " pod="openshift-infra/auto-csr-approver-29565290-zt5xt" Mar 19 10:50:00 crc kubenswrapper[4765]: I0319 10:50:00.316623 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x8dlh\" (UniqueName: \"kubernetes.io/projected/057f3e2f-f77b-4da3-a67b-4f0777602577-kube-api-access-x8dlh\") pod \"auto-csr-approver-29565290-zt5xt\" (UID: \"057f3e2f-f77b-4da3-a67b-4f0777602577\") " 
pod="openshift-infra/auto-csr-approver-29565290-zt5xt" Mar 19 10:50:00 crc kubenswrapper[4765]: I0319 10:50:00.471773 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565290-zt5xt" Mar 19 10:50:00 crc kubenswrapper[4765]: I0319 10:50:00.964309 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565290-zt5xt"] Mar 19 10:50:01 crc kubenswrapper[4765]: I0319 10:50:01.383073 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565290-zt5xt" event={"ID":"057f3e2f-f77b-4da3-a67b-4f0777602577","Type":"ContainerStarted","Data":"a61a6c0ab8d4c9537d602da7b7cca1813cfda379765f2648f15b7567128f479c"} Mar 19 10:50:01 crc kubenswrapper[4765]: I0319 10:50:01.656505 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:50:01 crc kubenswrapper[4765]: I0319 10:50:01.656900 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:50:01 crc kubenswrapper[4765]: I0319 10:50:01.656975 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 10:50:01 crc kubenswrapper[4765]: I0319 10:50:01.657784 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984"} 
pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 10:50:01 crc kubenswrapper[4765]: I0319 10:50:01.657855 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" gracePeriod=600 Mar 19 10:50:01 crc kubenswrapper[4765]: E0319 10:50:01.783878 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:50:02 crc kubenswrapper[4765]: I0319 10:50:02.400187 4765 generic.go:334] "Generic (PLEG): container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" exitCode=0 Mar 19 10:50:02 crc kubenswrapper[4765]: I0319 10:50:02.400246 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984"} Mar 19 10:50:02 crc kubenswrapper[4765]: I0319 10:50:02.400289 4765 scope.go:117] "RemoveContainer" containerID="7fbbabc77237677f702271306a25be40ef78a15b44ac1218092fa412c82ce0c1" Mar 19 10:50:02 crc kubenswrapper[4765]: I0319 10:50:02.401131 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 
19 10:50:02 crc kubenswrapper[4765]: E0319 10:50:02.401404 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:50:03 crc kubenswrapper[4765]: I0319 10:50:03.416429 4765 generic.go:334] "Generic (PLEG): container finished" podID="057f3e2f-f77b-4da3-a67b-4f0777602577" containerID="09c9fea2cef66298e6fbd2673086ad3b083adde2f4a2e45fb7f67dcb13962309" exitCode=0 Mar 19 10:50:03 crc kubenswrapper[4765]: I0319 10:50:03.416532 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565290-zt5xt" event={"ID":"057f3e2f-f77b-4da3-a67b-4f0777602577","Type":"ContainerDied","Data":"09c9fea2cef66298e6fbd2673086ad3b083adde2f4a2e45fb7f67dcb13962309"} Mar 19 10:50:04 crc kubenswrapper[4765]: I0319 10:50:04.765659 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565290-zt5xt" Mar 19 10:50:04 crc kubenswrapper[4765]: I0319 10:50:04.883480 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x8dlh\" (UniqueName: \"kubernetes.io/projected/057f3e2f-f77b-4da3-a67b-4f0777602577-kube-api-access-x8dlh\") pod \"057f3e2f-f77b-4da3-a67b-4f0777602577\" (UID: \"057f3e2f-f77b-4da3-a67b-4f0777602577\") " Mar 19 10:50:04 crc kubenswrapper[4765]: I0319 10:50:04.892454 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/057f3e2f-f77b-4da3-a67b-4f0777602577-kube-api-access-x8dlh" (OuterVolumeSpecName: "kube-api-access-x8dlh") pod "057f3e2f-f77b-4da3-a67b-4f0777602577" (UID: "057f3e2f-f77b-4da3-a67b-4f0777602577"). 
InnerVolumeSpecName "kube-api-access-x8dlh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:50:04 crc kubenswrapper[4765]: I0319 10:50:04.985546 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x8dlh\" (UniqueName: \"kubernetes.io/projected/057f3e2f-f77b-4da3-a67b-4f0777602577-kube-api-access-x8dlh\") on node \"crc\" DevicePath \"\"" Mar 19 10:50:05 crc kubenswrapper[4765]: I0319 10:50:05.438923 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565290-zt5xt" event={"ID":"057f3e2f-f77b-4da3-a67b-4f0777602577","Type":"ContainerDied","Data":"a61a6c0ab8d4c9537d602da7b7cca1813cfda379765f2648f15b7567128f479c"} Mar 19 10:50:05 crc kubenswrapper[4765]: I0319 10:50:05.438987 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a61a6c0ab8d4c9537d602da7b7cca1813cfda379765f2648f15b7567128f479c" Mar 19 10:50:05 crc kubenswrapper[4765]: I0319 10:50:05.439039 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565290-zt5xt" Mar 19 10:50:05 crc kubenswrapper[4765]: I0319 10:50:05.833459 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565284-5vl4n"] Mar 19 10:50:05 crc kubenswrapper[4765]: I0319 10:50:05.844157 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565284-5vl4n"] Mar 19 10:50:06 crc kubenswrapper[4765]: I0319 10:50:06.368147 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="183750a8-660e-41fd-85f8-6deb34af3c2c" path="/var/lib/kubelet/pods/183750a8-660e-41fd-85f8-6deb34af3c2c/volumes" Mar 19 10:50:17 crc kubenswrapper[4765]: I0319 10:50:17.356306 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:50:17 crc kubenswrapper[4765]: E0319 10:50:17.357044 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:50:29 crc kubenswrapper[4765]: I0319 10:50:29.356195 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:50:29 crc kubenswrapper[4765]: E0319 10:50:29.357011 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" 
podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:50:41 crc kubenswrapper[4765]: I0319 10:50:41.356204 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:50:41 crc kubenswrapper[4765]: E0319 10:50:41.357215 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:50:47 crc kubenswrapper[4765]: I0319 10:50:47.859508 4765 generic.go:334] "Generic (PLEG): container finished" podID="0050f5ac-5380-49b0-98ad-fdd7c3b94f51" containerID="cc48488fe36e4a6a946465c90997f849f8c3bc2150800a8e896cc284b89df7f3" exitCode=0 Mar 19 10:50:47 crc kubenswrapper[4765]: I0319 10:50:47.859605 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" event={"ID":"0050f5ac-5380-49b0-98ad-fdd7c3b94f51","Type":"ContainerDied","Data":"cc48488fe36e4a6a946465c90997f849f8c3bc2150800a8e896cc284b89df7f3"} Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.374646 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.562911 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-brsd2\" (UniqueName: \"kubernetes.io/projected/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-kube-api-access-brsd2\") pod \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.563076 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-bootstrap-combined-ca-bundle\") pod \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.563203 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-ssh-key-openstack-edpm-ipam\") pod \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.563253 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-inventory\") pod \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\" (UID: \"0050f5ac-5380-49b0-98ad-fdd7c3b94f51\") " Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.574317 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-kube-api-access-brsd2" (OuterVolumeSpecName: "kube-api-access-brsd2") pod "0050f5ac-5380-49b0-98ad-fdd7c3b94f51" (UID: "0050f5ac-5380-49b0-98ad-fdd7c3b94f51"). InnerVolumeSpecName "kube-api-access-brsd2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.593289 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "0050f5ac-5380-49b0-98ad-fdd7c3b94f51" (UID: "0050f5ac-5380-49b0-98ad-fdd7c3b94f51"). InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.619118 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0050f5ac-5380-49b0-98ad-fdd7c3b94f51" (UID: "0050f5ac-5380-49b0-98ad-fdd7c3b94f51"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.623263 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-inventory" (OuterVolumeSpecName: "inventory") pod "0050f5ac-5380-49b0-98ad-fdd7c3b94f51" (UID: "0050f5ac-5380-49b0-98ad-fdd7c3b94f51"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.664989 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.665022 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-brsd2\" (UniqueName: \"kubernetes.io/projected/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-kube-api-access-brsd2\") on node \"crc\" DevicePath \"\"" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.665033 4765 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.665044 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0050f5ac-5380-49b0-98ad-fdd7c3b94f51-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.877580 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" event={"ID":"0050f5ac-5380-49b0-98ad-fdd7c3b94f51","Type":"ContainerDied","Data":"b28e2d0af306dfdb7546ddf8a39da22b31fb7c2cb9c20294c917bbaddcdf4bbe"} Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.877619 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b28e2d0af306dfdb7546ddf8a39da22b31fb7c2cb9c20294c917bbaddcdf4bbe" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.877638 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.993630 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr"] Mar 19 10:50:49 crc kubenswrapper[4765]: E0319 10:50:49.994161 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0050f5ac-5380-49b0-98ad-fdd7c3b94f51" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.994177 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="0050f5ac-5380-49b0-98ad-fdd7c3b94f51" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 10:50:49 crc kubenswrapper[4765]: E0319 10:50:49.994193 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="057f3e2f-f77b-4da3-a67b-4f0777602577" containerName="oc" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.994202 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="057f3e2f-f77b-4da3-a67b-4f0777602577" containerName="oc" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.994422 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="0050f5ac-5380-49b0-98ad-fdd7c3b94f51" containerName="bootstrap-edpm-deployment-openstack-edpm-ipam" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.994450 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="057f3e2f-f77b-4da3-a67b-4f0777602577" containerName="oc" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.995162 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.998119 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.998355 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.998610 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c" Mar 19 10:50:49 crc kubenswrapper[4765]: I0319 10:50:49.998752 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:50:50 crc kubenswrapper[4765]: I0319 10:50:50.004021 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr"] Mar 19 10:50:50 crc kubenswrapper[4765]: I0319 10:50:50.174407 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5381bac5-1b71-4489-97fd-c49d0ae1783b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr\" (UID: \"5381bac5-1b71-4489-97fd-c49d0ae1783b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" Mar 19 10:50:50 crc kubenswrapper[4765]: I0319 10:50:50.174794 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5381bac5-1b71-4489-97fd-c49d0ae1783b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr\" (UID: \"5381bac5-1b71-4489-97fd-c49d0ae1783b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" Mar 19 10:50:50 crc kubenswrapper[4765]: I0319 
10:50:50.174884 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqbtr\" (UniqueName: \"kubernetes.io/projected/5381bac5-1b71-4489-97fd-c49d0ae1783b-kube-api-access-bqbtr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr\" (UID: \"5381bac5-1b71-4489-97fd-c49d0ae1783b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" Mar 19 10:50:50 crc kubenswrapper[4765]: I0319 10:50:50.276495 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5381bac5-1b71-4489-97fd-c49d0ae1783b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr\" (UID: \"5381bac5-1b71-4489-97fd-c49d0ae1783b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" Mar 19 10:50:50 crc kubenswrapper[4765]: I0319 10:50:50.276660 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5381bac5-1b71-4489-97fd-c49d0ae1783b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr\" (UID: \"5381bac5-1b71-4489-97fd-c49d0ae1783b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" Mar 19 10:50:50 crc kubenswrapper[4765]: I0319 10:50:50.276690 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqbtr\" (UniqueName: \"kubernetes.io/projected/5381bac5-1b71-4489-97fd-c49d0ae1783b-kube-api-access-bqbtr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr\" (UID: \"5381bac5-1b71-4489-97fd-c49d0ae1783b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" Mar 19 10:50:50 crc kubenswrapper[4765]: I0319 10:50:50.283108 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/5381bac5-1b71-4489-97fd-c49d0ae1783b-ssh-key-openstack-edpm-ipam\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr\" (UID: \"5381bac5-1b71-4489-97fd-c49d0ae1783b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" Mar 19 10:50:50 crc kubenswrapper[4765]: I0319 10:50:50.290751 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5381bac5-1b71-4489-97fd-c49d0ae1783b-inventory\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr\" (UID: \"5381bac5-1b71-4489-97fd-c49d0ae1783b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" Mar 19 10:50:50 crc kubenswrapper[4765]: I0319 10:50:50.291370 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqbtr\" (UniqueName: \"kubernetes.io/projected/5381bac5-1b71-4489-97fd-c49d0ae1783b-kube-api-access-bqbtr\") pod \"download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr\" (UID: \"5381bac5-1b71-4489-97fd-c49d0ae1783b\") " pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" Mar 19 10:50:50 crc kubenswrapper[4765]: I0319 10:50:50.312136 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" Mar 19 10:50:50 crc kubenswrapper[4765]: I0319 10:50:50.795067 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr"] Mar 19 10:50:50 crc kubenswrapper[4765]: I0319 10:50:50.888837 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" event={"ID":"5381bac5-1b71-4489-97fd-c49d0ae1783b","Type":"ContainerStarted","Data":"3377c3eda57c1f70fcd09f4670f1961915239b8aeea1ab99be417041eb410195"} Mar 19 10:50:52 crc kubenswrapper[4765]: I0319 10:50:52.363351 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:50:52 crc kubenswrapper[4765]: E0319 10:50:52.363850 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:50:52 crc kubenswrapper[4765]: I0319 10:50:52.907069 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" event={"ID":"5381bac5-1b71-4489-97fd-c49d0ae1783b","Type":"ContainerStarted","Data":"7432ccfee481f8c559dd06807463ab2c712740eb431799d9db885d3a81ea8d6c"} Mar 19 10:50:52 crc kubenswrapper[4765]: I0319 10:50:52.933432 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" podStartSLOduration=2.4086664239999998 podStartE2EDuration="3.933413985s" podCreationTimestamp="2026-03-19 10:50:49 +0000 UTC" 
firstStartedPulling="2026-03-19 10:50:50.805923169 +0000 UTC m=+1749.154868701" lastFinishedPulling="2026-03-19 10:50:52.33067072 +0000 UTC m=+1750.679616262" observedRunningTime="2026-03-19 10:50:52.920483705 +0000 UTC m=+1751.269429267" watchObservedRunningTime="2026-03-19 10:50:52.933413985 +0000 UTC m=+1751.282359527" Mar 19 10:50:58 crc kubenswrapper[4765]: I0319 10:50:58.309151 4765 scope.go:117] "RemoveContainer" containerID="c0ef87341cad1856dc8776dcbfd142ffab12f6f170e88b3861bb452ff00eda65" Mar 19 10:51:07 crc kubenswrapper[4765]: I0319 10:51:07.356674 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:51:07 crc kubenswrapper[4765]: E0319 10:51:07.357577 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:51:12 crc kubenswrapper[4765]: I0319 10:51:12.051888 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-create-gn9lj"] Mar 19 10:51:12 crc kubenswrapper[4765]: I0319 10:51:12.064356 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-505a-account-create-update-gp6pz"] Mar 19 10:51:12 crc kubenswrapper[4765]: I0319 10:51:12.092172 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-create-gn9lj"] Mar 19 10:51:12 crc kubenswrapper[4765]: I0319 10:51:12.106435 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-505a-account-create-update-gp6pz"] Mar 19 10:51:12 crc kubenswrapper[4765]: I0319 10:51:12.369120 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c" path="/var/lib/kubelet/pods/4a28c70e-23a3-4483-83a7-d8c3bf7a8f3c/volumes" Mar 19 10:51:12 crc kubenswrapper[4765]: I0319 10:51:12.371989 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="777c1144-ff82-4ac0-a887-e5859bccf142" path="/var/lib/kubelet/pods/777c1144-ff82-4ac0-a887-e5859bccf142/volumes" Mar 19 10:51:13 crc kubenswrapper[4765]: I0319 10:51:13.027123 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-create-gftrk"] Mar 19 10:51:13 crc kubenswrapper[4765]: I0319 10:51:13.039290 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-create-gftrk"] Mar 19 10:51:14 crc kubenswrapper[4765]: I0319 10:51:14.048311 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-create-25h9b"] Mar 19 10:51:14 crc kubenswrapper[4765]: I0319 10:51:14.061477 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-3f4d-account-create-update-qmrd4"] Mar 19 10:51:14 crc kubenswrapper[4765]: I0319 10:51:14.072606 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-eb4b-account-create-update-jtnjd"] Mar 19 10:51:14 crc kubenswrapper[4765]: I0319 10:51:14.081236 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-create-25h9b"] Mar 19 10:51:14 crc kubenswrapper[4765]: I0319 10:51:14.089082 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-3f4d-account-create-update-qmrd4"] Mar 19 10:51:14 crc kubenswrapper[4765]: I0319 10:51:14.097384 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-eb4b-account-create-update-jtnjd"] Mar 19 10:51:14 crc kubenswrapper[4765]: I0319 10:51:14.366221 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31509f20-2f87-4039-b2fc-7a65a11e34e8" path="/var/lib/kubelet/pods/31509f20-2f87-4039-b2fc-7a65a11e34e8/volumes" Mar 19 10:51:14 crc 
kubenswrapper[4765]: I0319 10:51:14.369691 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cca3085-ce1d-43c5-ada0-89d57e6ce578" path="/var/lib/kubelet/pods/3cca3085-ce1d-43c5-ada0-89d57e6ce578/volumes" Mar 19 10:51:14 crc kubenswrapper[4765]: I0319 10:51:14.372218 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cb59bf52-1cd4-4d26-9c5d-9ee4561267c5" path="/var/lib/kubelet/pods/cb59bf52-1cd4-4d26-9c5d-9ee4561267c5/volumes" Mar 19 10:51:14 crc kubenswrapper[4765]: I0319 10:51:14.374861 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc" path="/var/lib/kubelet/pods/df5a261f-bdce-48c2-b3d1-9c52b2b4b0dc/volumes" Mar 19 10:51:18 crc kubenswrapper[4765]: I0319 10:51:18.359473 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:51:18 crc kubenswrapper[4765]: E0319 10:51:18.360016 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:51:32 crc kubenswrapper[4765]: I0319 10:51:32.363030 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:51:32 crc kubenswrapper[4765]: E0319 10:51:32.363800 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:51:46 crc kubenswrapper[4765]: I0319 10:51:46.357165 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:51:46 crc kubenswrapper[4765]: E0319 10:51:46.357924 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:51:47 crc kubenswrapper[4765]: I0319 10:51:47.039538 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/root-account-create-update-zqlg2"] Mar 19 10:51:47 crc kubenswrapper[4765]: I0319 10:51:47.048609 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/root-account-create-update-zqlg2"] Mar 19 10:51:48 crc kubenswrapper[4765]: I0319 10:51:48.386077 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c4e7a7f0-2aaf-4929-89aa-c96424bfca68" path="/var/lib/kubelet/pods/c4e7a7f0-2aaf-4929-89aa-c96424bfca68/volumes" Mar 19 10:51:55 crc kubenswrapper[4765]: I0319 10:51:55.039287 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/glance-db-sync-ds8t6"] Mar 19 10:51:55 crc kubenswrapper[4765]: I0319 10:51:55.047891 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/glance-db-sync-ds8t6"] Mar 19 10:51:56 crc kubenswrapper[4765]: I0319 10:51:56.367317 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0" path="/var/lib/kubelet/pods/06ef9738-82a7-4b48-88c4-ef7fa8ee3cf0/volumes" Mar 19 10:51:58 crc kubenswrapper[4765]: I0319 10:51:58.355542 
4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:51:58 crc kubenswrapper[4765]: E0319 10:51:58.356103 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:51:58 crc kubenswrapper[4765]: I0319 10:51:58.385790 4765 scope.go:117] "RemoveContainer" containerID="1876e2731ada5fcbc3fe82577681e2b96f37a56a78f1c4731475777f13f08b23" Mar 19 10:51:58 crc kubenswrapper[4765]: I0319 10:51:58.429996 4765 scope.go:117] "RemoveContainer" containerID="319a3617a109994e7fdcc8484dc49848d8ba20de19e30ee60a87eef1b0301cb5" Mar 19 10:51:58 crc kubenswrapper[4765]: I0319 10:51:58.469523 4765 scope.go:117] "RemoveContainer" containerID="42d5747a0b80f79f343538a6daea1710955b1ade686b916eb415f3186624c37a" Mar 19 10:51:58 crc kubenswrapper[4765]: I0319 10:51:58.515284 4765 scope.go:117] "RemoveContainer" containerID="25d836aaf60cf437302090a0c8a573627f829543e9d3b69586e24d3139666592" Mar 19 10:51:58 crc kubenswrapper[4765]: I0319 10:51:58.559514 4765 scope.go:117] "RemoveContainer" containerID="ee29eba42d553c93953418fe06ecbf9d6ed4b1cd71a398e57f4359ab6ab93961" Mar 19 10:51:58 crc kubenswrapper[4765]: I0319 10:51:58.607117 4765 scope.go:117] "RemoveContainer" containerID="5f13f02cae162b12e1daf9f9d85ecaa6386d6acdeb2214a434274b38d635c0fd" Mar 19 10:51:58 crc kubenswrapper[4765]: I0319 10:51:58.639177 4765 scope.go:117] "RemoveContainer" containerID="ea969d495984141102882c64dee73fec4a6bf28cfc97fd0ba8ed66f28b209303" Mar 19 10:51:58 crc kubenswrapper[4765]: I0319 10:51:58.659052 4765 scope.go:117] "RemoveContainer" 
containerID="1cc3d8725f02e33192f2fee9bfc932918d270bd09ff573e83da6f36acd68e948" Mar 19 10:52:00 crc kubenswrapper[4765]: I0319 10:52:00.150119 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565292-w8fpg"] Mar 19 10:52:00 crc kubenswrapper[4765]: I0319 10:52:00.151814 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565292-w8fpg" Mar 19 10:52:00 crc kubenswrapper[4765]: I0319 10:52:00.154891 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:52:00 crc kubenswrapper[4765]: I0319 10:52:00.155443 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:52:00 crc kubenswrapper[4765]: I0319 10:52:00.155684 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:52:00 crc kubenswrapper[4765]: I0319 10:52:00.160638 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565292-w8fpg"] Mar 19 10:52:00 crc kubenswrapper[4765]: I0319 10:52:00.232761 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-prnhv\" (UniqueName: \"kubernetes.io/projected/257ec543-fc43-45ea-b218-5c7772da983f-kube-api-access-prnhv\") pod \"auto-csr-approver-29565292-w8fpg\" (UID: \"257ec543-fc43-45ea-b218-5c7772da983f\") " pod="openshift-infra/auto-csr-approver-29565292-w8fpg" Mar 19 10:52:00 crc kubenswrapper[4765]: I0319 10:52:00.335969 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-prnhv\" (UniqueName: \"kubernetes.io/projected/257ec543-fc43-45ea-b218-5c7772da983f-kube-api-access-prnhv\") pod \"auto-csr-approver-29565292-w8fpg\" (UID: \"257ec543-fc43-45ea-b218-5c7772da983f\") " 
pod="openshift-infra/auto-csr-approver-29565292-w8fpg" Mar 19 10:52:00 crc kubenswrapper[4765]: I0319 10:52:00.356981 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-prnhv\" (UniqueName: \"kubernetes.io/projected/257ec543-fc43-45ea-b218-5c7772da983f-kube-api-access-prnhv\") pod \"auto-csr-approver-29565292-w8fpg\" (UID: \"257ec543-fc43-45ea-b218-5c7772da983f\") " pod="openshift-infra/auto-csr-approver-29565292-w8fpg" Mar 19 10:52:00 crc kubenswrapper[4765]: I0319 10:52:00.471593 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565292-w8fpg" Mar 19 10:52:00 crc kubenswrapper[4765]: I0319 10:52:00.937502 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565292-w8fpg"] Mar 19 10:52:01 crc kubenswrapper[4765]: I0319 10:52:01.039527 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-create-6fk75"] Mar 19 10:52:01 crc kubenswrapper[4765]: I0319 10:52:01.051807 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-create-6fk75"] Mar 19 10:52:01 crc kubenswrapper[4765]: I0319 10:52:01.555946 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565292-w8fpg" event={"ID":"257ec543-fc43-45ea-b218-5c7772da983f","Type":"ContainerStarted","Data":"b222c79f780d171c0ce0b381e136ab588bcaf867dcf328d4df3553e2685abf52"} Mar 19 10:52:02 crc kubenswrapper[4765]: I0319 10:52:02.029845 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-bedf-account-create-update-s9kdd"] Mar 19 10:52:02 crc kubenswrapper[4765]: I0319 10:52:02.053702 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-bedf-account-create-update-s9kdd"] Mar 19 10:52:02 crc kubenswrapper[4765]: I0319 10:52:02.379350 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="549655a2-d327-424f-ae72-6491fa466bdd" 
path="/var/lib/kubelet/pods/549655a2-d327-424f-ae72-6491fa466bdd/volumes" Mar 19 10:52:02 crc kubenswrapper[4765]: I0319 10:52:02.389239 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4914033-41f9-467d-b55e-c1d89a8fab4b" path="/var/lib/kubelet/pods/a4914033-41f9-467d-b55e-c1d89a8fab4b/volumes" Mar 19 10:52:03 crc kubenswrapper[4765]: I0319 10:52:03.047703 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-507b-account-create-update-jxpl9"] Mar 19 10:52:03 crc kubenswrapper[4765]: I0319 10:52:03.062504 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-31fd-account-create-update-9b8kl"] Mar 19 10:52:03 crc kubenswrapper[4765]: I0319 10:52:03.073802 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-create-h2h92"] Mar 19 10:52:03 crc kubenswrapper[4765]: I0319 10:52:03.083274 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-create-6jl85"] Mar 19 10:52:03 crc kubenswrapper[4765]: I0319 10:52:03.091842 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-507b-account-create-update-jxpl9"] Mar 19 10:52:03 crc kubenswrapper[4765]: I0319 10:52:03.100657 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-31fd-account-create-update-9b8kl"] Mar 19 10:52:03 crc kubenswrapper[4765]: I0319 10:52:03.111059 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-create-h2h92"] Mar 19 10:52:03 crc kubenswrapper[4765]: I0319 10:52:03.119842 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-create-6jl85"] Mar 19 10:52:03 crc kubenswrapper[4765]: I0319 10:52:03.574326 4765 generic.go:334] "Generic (PLEG): container finished" podID="257ec543-fc43-45ea-b218-5c7772da983f" containerID="1449aac9dcfd24463c7390cacd543a0362a3d57e756744468f966b2a950137ae" exitCode=0 Mar 19 10:52:03 crc kubenswrapper[4765]: I0319 10:52:03.574373 4765 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565292-w8fpg" event={"ID":"257ec543-fc43-45ea-b218-5c7772da983f","Type":"ContainerDied","Data":"1449aac9dcfd24463c7390cacd543a0362a3d57e756744468f966b2a950137ae"} Mar 19 10:52:04 crc kubenswrapper[4765]: I0319 10:52:04.366855 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="36534f7f-ff2e-45f0-8fc1-827ae9ebd32b" path="/var/lib/kubelet/pods/36534f7f-ff2e-45f0-8fc1-827ae9ebd32b/volumes" Mar 19 10:52:04 crc kubenswrapper[4765]: I0319 10:52:04.368771 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="453b3d54-509f-4f11-a718-0bd8e271953e" path="/var/lib/kubelet/pods/453b3d54-509f-4f11-a718-0bd8e271953e/volumes" Mar 19 10:52:04 crc kubenswrapper[4765]: I0319 10:52:04.369599 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7f66d34e-14df-45dc-b7a6-8a83c4f6b19f" path="/var/lib/kubelet/pods/7f66d34e-14df-45dc-b7a6-8a83c4f6b19f/volumes" Mar 19 10:52:04 crc kubenswrapper[4765]: I0319 10:52:04.371126 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95479eb2-1db7-4def-a4da-5ce9dbf85e13" path="/var/lib/kubelet/pods/95479eb2-1db7-4def-a4da-5ce9dbf85e13/volumes" Mar 19 10:52:04 crc kubenswrapper[4765]: I0319 10:52:04.891521 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565292-w8fpg" Mar 19 10:52:05 crc kubenswrapper[4765]: I0319 10:52:05.027800 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-prnhv\" (UniqueName: \"kubernetes.io/projected/257ec543-fc43-45ea-b218-5c7772da983f-kube-api-access-prnhv\") pod \"257ec543-fc43-45ea-b218-5c7772da983f\" (UID: \"257ec543-fc43-45ea-b218-5c7772da983f\") " Mar 19 10:52:05 crc kubenswrapper[4765]: I0319 10:52:05.035321 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/257ec543-fc43-45ea-b218-5c7772da983f-kube-api-access-prnhv" (OuterVolumeSpecName: "kube-api-access-prnhv") pod "257ec543-fc43-45ea-b218-5c7772da983f" (UID: "257ec543-fc43-45ea-b218-5c7772da983f"). InnerVolumeSpecName "kube-api-access-prnhv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:52:05 crc kubenswrapper[4765]: I0319 10:52:05.130815 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-prnhv\" (UniqueName: \"kubernetes.io/projected/257ec543-fc43-45ea-b218-5c7772da983f-kube-api-access-prnhv\") on node \"crc\" DevicePath \"\"" Mar 19 10:52:05 crc kubenswrapper[4765]: I0319 10:52:05.591405 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565292-w8fpg" event={"ID":"257ec543-fc43-45ea-b218-5c7772da983f","Type":"ContainerDied","Data":"b222c79f780d171c0ce0b381e136ab588bcaf867dcf328d4df3553e2685abf52"} Mar 19 10:52:05 crc kubenswrapper[4765]: I0319 10:52:05.591450 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565292-w8fpg" Mar 19 10:52:05 crc kubenswrapper[4765]: I0319 10:52:05.591462 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b222c79f780d171c0ce0b381e136ab588bcaf867dcf328d4df3553e2685abf52" Mar 19 10:52:05 crc kubenswrapper[4765]: I0319 10:52:05.945676 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565286-cjfhr"] Mar 19 10:52:05 crc kubenswrapper[4765]: I0319 10:52:05.955520 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565286-cjfhr"] Mar 19 10:52:06 crc kubenswrapper[4765]: I0319 10:52:06.368746 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="741b1716-e212-46d2-b388-41d14845f2b0" path="/var/lib/kubelet/pods/741b1716-e212-46d2-b388-41d14845f2b0/volumes" Mar 19 10:52:09 crc kubenswrapper[4765]: I0319 10:52:09.356842 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:52:09 crc kubenswrapper[4765]: E0319 10:52:09.357378 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:52:14 crc kubenswrapper[4765]: I0319 10:52:14.030700 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-db-sync-597fd"] Mar 19 10:52:14 crc kubenswrapper[4765]: I0319 10:52:14.041190 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-db-sync-597fd"] Mar 19 10:52:14 crc kubenswrapper[4765]: I0319 10:52:14.367353 4765 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="4216cf70-5a7a-4d05-8c3d-20c4af295ac4" path="/var/lib/kubelet/pods/4216cf70-5a7a-4d05-8c3d-20c4af295ac4/volumes" Mar 19 10:52:23 crc kubenswrapper[4765]: I0319 10:52:23.356544 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:52:23 crc kubenswrapper[4765]: E0319 10:52:23.357905 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:52:37 crc kubenswrapper[4765]: I0319 10:52:37.356326 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:52:37 crc kubenswrapper[4765]: E0319 10:52:37.357145 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:52:43 crc kubenswrapper[4765]: I0319 10:52:43.925173 4765 generic.go:334] "Generic (PLEG): container finished" podID="5381bac5-1b71-4489-97fd-c49d0ae1783b" containerID="7432ccfee481f8c559dd06807463ab2c712740eb431799d9db885d3a81ea8d6c" exitCode=0 Mar 19 10:52:43 crc kubenswrapper[4765]: I0319 10:52:43.925302 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" 
event={"ID":"5381bac5-1b71-4489-97fd-c49d0ae1783b","Type":"ContainerDied","Data":"7432ccfee481f8c559dd06807463ab2c712740eb431799d9db885d3a81ea8d6c"} Mar 19 10:52:45 crc kubenswrapper[4765]: I0319 10:52:45.365500 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" Mar 19 10:52:45 crc kubenswrapper[4765]: I0319 10:52:45.471237 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5381bac5-1b71-4489-97fd-c49d0ae1783b-inventory\") pod \"5381bac5-1b71-4489-97fd-c49d0ae1783b\" (UID: \"5381bac5-1b71-4489-97fd-c49d0ae1783b\") " Mar 19 10:52:45 crc kubenswrapper[4765]: I0319 10:52:45.471471 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqbtr\" (UniqueName: \"kubernetes.io/projected/5381bac5-1b71-4489-97fd-c49d0ae1783b-kube-api-access-bqbtr\") pod \"5381bac5-1b71-4489-97fd-c49d0ae1783b\" (UID: \"5381bac5-1b71-4489-97fd-c49d0ae1783b\") " Mar 19 10:52:45 crc kubenswrapper[4765]: I0319 10:52:45.471539 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5381bac5-1b71-4489-97fd-c49d0ae1783b-ssh-key-openstack-edpm-ipam\") pod \"5381bac5-1b71-4489-97fd-c49d0ae1783b\" (UID: \"5381bac5-1b71-4489-97fd-c49d0ae1783b\") " Mar 19 10:52:45 crc kubenswrapper[4765]: I0319 10:52:45.488234 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5381bac5-1b71-4489-97fd-c49d0ae1783b-kube-api-access-bqbtr" (OuterVolumeSpecName: "kube-api-access-bqbtr") pod "5381bac5-1b71-4489-97fd-c49d0ae1783b" (UID: "5381bac5-1b71-4489-97fd-c49d0ae1783b"). InnerVolumeSpecName "kube-api-access-bqbtr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:52:45 crc kubenswrapper[4765]: I0319 10:52:45.526866 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5381bac5-1b71-4489-97fd-c49d0ae1783b-inventory" (OuterVolumeSpecName: "inventory") pod "5381bac5-1b71-4489-97fd-c49d0ae1783b" (UID: "5381bac5-1b71-4489-97fd-c49d0ae1783b"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:52:45 crc kubenswrapper[4765]: I0319 10:52:45.533242 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5381bac5-1b71-4489-97fd-c49d0ae1783b-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5381bac5-1b71-4489-97fd-c49d0ae1783b" (UID: "5381bac5-1b71-4489-97fd-c49d0ae1783b"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:52:45 crc kubenswrapper[4765]: I0319 10:52:45.579596 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqbtr\" (UniqueName: \"kubernetes.io/projected/5381bac5-1b71-4489-97fd-c49d0ae1783b-kube-api-access-bqbtr\") on node \"crc\" DevicePath \"\"" Mar 19 10:52:45 crc kubenswrapper[4765]: I0319 10:52:45.579647 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5381bac5-1b71-4489-97fd-c49d0ae1783b-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:52:45 crc kubenswrapper[4765]: I0319 10:52:45.579663 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5381bac5-1b71-4489-97fd-c49d0ae1783b-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:52:45 crc kubenswrapper[4765]: I0319 10:52:45.944487 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" 
event={"ID":"5381bac5-1b71-4489-97fd-c49d0ae1783b","Type":"ContainerDied","Data":"3377c3eda57c1f70fcd09f4670f1961915239b8aeea1ab99be417041eb410195"} Mar 19 10:52:45 crc kubenswrapper[4765]: I0319 10:52:45.944531 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3377c3eda57c1f70fcd09f4670f1961915239b8aeea1ab99be417041eb410195" Mar 19 10:52:45 crc kubenswrapper[4765]: I0319 10:52:45.944577 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.034844 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw"] Mar 19 10:52:46 crc kubenswrapper[4765]: E0319 10:52:46.037230 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="257ec543-fc43-45ea-b218-5c7772da983f" containerName="oc" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.037264 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="257ec543-fc43-45ea-b218-5c7772da983f" containerName="oc" Mar 19 10:52:46 crc kubenswrapper[4765]: E0319 10:52:46.037288 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5381bac5-1b71-4489-97fd-c49d0ae1783b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.037298 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5381bac5-1b71-4489-97fd-c49d0ae1783b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.037563 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5381bac5-1b71-4489-97fd-c49d0ae1783b" containerName="download-cache-edpm-deployment-openstack-edpm-ipam" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.037595 4765 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="257ec543-fc43-45ea-b218-5c7772da983f" containerName="oc" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.038477 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.040859 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.042219 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.045167 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.045409 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.056671 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw"] Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.191058 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4twkx\" (UniqueName: \"kubernetes.io/projected/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-kube-api-access-4twkx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw\" (UID: \"fd32b580-78f7-478e-ba1d-9d1a86b75f3a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.191256 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-inventory\") pod 
\"configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw\" (UID: \"fd32b580-78f7-478e-ba1d-9d1a86b75f3a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.191364 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw\" (UID: \"fd32b580-78f7-478e-ba1d-9d1a86b75f3a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.293501 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw\" (UID: \"fd32b580-78f7-478e-ba1d-9d1a86b75f3a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.293581 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw\" (UID: \"fd32b580-78f7-478e-ba1d-9d1a86b75f3a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.293756 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4twkx\" (UniqueName: \"kubernetes.io/projected/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-kube-api-access-4twkx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw\" (UID: \"fd32b580-78f7-478e-ba1d-9d1a86b75f3a\") " 
pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.306759 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-ssh-key-openstack-edpm-ipam\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw\" (UID: \"fd32b580-78f7-478e-ba1d-9d1a86b75f3a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.306835 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-inventory\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw\" (UID: \"fd32b580-78f7-478e-ba1d-9d1a86b75f3a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.313735 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4twkx\" (UniqueName: \"kubernetes.io/projected/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-kube-api-access-4twkx\") pod \"configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw\" (UID: \"fd32b580-78f7-478e-ba1d-9d1a86b75f3a\") " pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.355806 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.705806 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.709126 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw"] Mar 19 10:52:46 crc kubenswrapper[4765]: I0319 10:52:46.956814 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" event={"ID":"fd32b580-78f7-478e-ba1d-9d1a86b75f3a","Type":"ContainerStarted","Data":"c62304cbac352a655322eb030bd8e0f75615be500f7d1c79ec3ca9520a3bdcfd"} Mar 19 10:52:47 crc kubenswrapper[4765]: I0319 10:52:47.967431 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" event={"ID":"fd32b580-78f7-478e-ba1d-9d1a86b75f3a","Type":"ContainerStarted","Data":"99bbe0da6b2fce87ecb7a804769f7b38c75b30dfdb9e448be8dd7dcef611021e"} Mar 19 10:52:47 crc kubenswrapper[4765]: I0319 10:52:47.991502 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" podStartSLOduration=1.464122802 podStartE2EDuration="1.991485155s" podCreationTimestamp="2026-03-19 10:52:46 +0000 UTC" firstStartedPulling="2026-03-19 10:52:46.705562865 +0000 UTC m=+1865.054508407" lastFinishedPulling="2026-03-19 10:52:47.232925218 +0000 UTC m=+1865.581870760" observedRunningTime="2026-03-19 10:52:47.98169911 +0000 UTC m=+1866.330644662" watchObservedRunningTime="2026-03-19 10:52:47.991485155 +0000 UTC m=+1866.340430697" Mar 19 10:52:52 crc kubenswrapper[4765]: I0319 10:52:52.363510 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 
10:52:52 crc kubenswrapper[4765]: E0319 10:52:52.364358 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:52:57 crc kubenswrapper[4765]: I0319 10:52:57.050224 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/keystone-bootstrap-j8hqn"] Mar 19 10:52:57 crc kubenswrapper[4765]: I0319 10:52:57.061649 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/keystone-bootstrap-j8hqn"] Mar 19 10:52:57 crc kubenswrapper[4765]: I0319 10:52:57.073987 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/placement-db-sync-q2fr9"] Mar 19 10:52:57 crc kubenswrapper[4765]: I0319 10:52:57.081514 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/placement-db-sync-q2fr9"] Mar 19 10:52:58 crc kubenswrapper[4765]: I0319 10:52:58.367319 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ec7bc5e-c876-4f51-8135-166f8ea45721" path="/var/lib/kubelet/pods/2ec7bc5e-c876-4f51-8135-166f8ea45721/volumes" Mar 19 10:52:58 crc kubenswrapper[4765]: I0319 10:52:58.368156 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed90710f-8437-4621-b01e-a78cb4f0a96c" path="/var/lib/kubelet/pods/ed90710f-8437-4621-b01e-a78cb4f0a96c/volumes" Mar 19 10:52:58 crc kubenswrapper[4765]: I0319 10:52:58.814481 4765 scope.go:117] "RemoveContainer" containerID="0daa0cf7d801aba08b6255a5d3f9d5ad1fb6f6a7bba78b1fc71d24cf3afa469e" Mar 19 10:52:58 crc kubenswrapper[4765]: I0319 10:52:58.874394 4765 scope.go:117] "RemoveContainer" containerID="f885113378798203ce2115b5b6acce36282db32d1ced42eb42999ef4341c8c4a" Mar 19 10:52:58 
crc kubenswrapper[4765]: I0319 10:52:58.917743 4765 scope.go:117] "RemoveContainer" containerID="6db17d80b8214aed4015b5b0aaab9ec8ac9c4c385259ead38f242a8e12bd869a" Mar 19 10:52:58 crc kubenswrapper[4765]: I0319 10:52:58.964441 4765 scope.go:117] "RemoveContainer" containerID="fe6d09544f88b41b3e7c804e567d48346b0bf23b5eaceb4e8474f71a9b463a43" Mar 19 10:52:59 crc kubenswrapper[4765]: I0319 10:52:59.012056 4765 scope.go:117] "RemoveContainer" containerID="a752eed62ec4bf77939b90fc56b5260c1b4f2eac5088987b4dba2288b1f30345" Mar 19 10:52:59 crc kubenswrapper[4765]: I0319 10:52:59.072280 4765 scope.go:117] "RemoveContainer" containerID="6ee19f54a1109bb1d46de72704b26d9f8cb61d2bf04b661e24ff583f45d89752" Mar 19 10:52:59 crc kubenswrapper[4765]: I0319 10:52:59.127227 4765 scope.go:117] "RemoveContainer" containerID="a1ee19cb4fc979afebeb07647143bf1626b8b2049c582ded46bc58c26cd82cca" Mar 19 10:52:59 crc kubenswrapper[4765]: I0319 10:52:59.156195 4765 scope.go:117] "RemoveContainer" containerID="b961b6ba6a0a5ead3cf0f18642f7b2af581727fa5ab5824f71ffa9873c8280fa" Mar 19 10:52:59 crc kubenswrapper[4765]: I0319 10:52:59.189700 4765 scope.go:117] "RemoveContainer" containerID="d522436d60bdd5fc0de249b55c07a37b11ad229d3cddb94b681dc057322aab8a" Mar 19 10:52:59 crc kubenswrapper[4765]: I0319 10:52:59.237565 4765 scope.go:117] "RemoveContainer" containerID="7521b214959d7571a4688a00f1f6cdd063309f03e9c54ad589a21fd2be254f9b" Mar 19 10:53:06 crc kubenswrapper[4765]: I0319 10:53:06.356439 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:53:06 crc kubenswrapper[4765]: E0319 10:53:06.357268 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:53:07 crc kubenswrapper[4765]: I0319 10:53:07.032865 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/barbican-db-sync-cczjt"] Mar 19 10:53:07 crc kubenswrapper[4765]: I0319 10:53:07.042753 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/barbican-db-sync-cczjt"] Mar 19 10:53:08 crc kubenswrapper[4765]: I0319 10:53:08.370377 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4012221-7e3d-4ee8-9c90-e564931f5a30" path="/var/lib/kubelet/pods/e4012221-7e3d-4ee8-9c90-e564931f5a30/volumes" Mar 19 10:53:09 crc kubenswrapper[4765]: I0319 10:53:09.026243 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/neutron-db-sync-jjjh8"] Mar 19 10:53:09 crc kubenswrapper[4765]: I0319 10:53:09.034460 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/neutron-db-sync-jjjh8"] Mar 19 10:53:10 crc kubenswrapper[4765]: I0319 10:53:10.367067 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e" path="/var/lib/kubelet/pods/fb34ed2a-1bbe-4ee9-9c00-7c1c340a868e/volumes" Mar 19 10:53:17 crc kubenswrapper[4765]: I0319 10:53:17.034889 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/cinder-db-sync-mbhfs"] Mar 19 10:53:17 crc kubenswrapper[4765]: I0319 10:53:17.044185 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/cinder-db-sync-mbhfs"] Mar 19 10:53:17 crc kubenswrapper[4765]: I0319 10:53:17.356819 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:53:17 crc kubenswrapper[4765]: E0319 10:53:17.357183 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:53:18 crc kubenswrapper[4765]: I0319 10:53:18.365585 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="21c84c7a-03f0-4ab5-a259-95a351cbdf13" path="/var/lib/kubelet/pods/21c84c7a-03f0-4ab5-a259-95a351cbdf13/volumes" Mar 19 10:53:32 crc kubenswrapper[4765]: I0319 10:53:32.363948 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:53:32 crc kubenswrapper[4765]: E0319 10:53:32.365017 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:53:46 crc kubenswrapper[4765]: I0319 10:53:46.357072 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:53:46 crc kubenswrapper[4765]: E0319 10:53:46.358196 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:53:53 crc kubenswrapper[4765]: I0319 10:53:53.054556 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack/nova-cell1-db-create-bgz4n"] Mar 19 10:53:53 crc kubenswrapper[4765]: I0319 10:53:53.065601 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-e031-account-create-update-smc8t"] Mar 19 10:53:53 crc kubenswrapper[4765]: I0319 10:53:53.076088 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-db-create-bg8xf"] Mar 19 10:53:53 crc kubenswrapper[4765]: I0319 10:53:53.097250 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-31ef-account-create-update-kcvq5"] Mar 19 10:53:53 crc kubenswrapper[4765]: I0319 10:53:53.109179 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-api-4a7c-account-create-update-tbxvw"] Mar 19 10:53:53 crc kubenswrapper[4765]: I0319 10:53:53.135527 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-db-create-bgz4n"] Mar 19 10:53:53 crc kubenswrapper[4765]: I0319 10:53:53.149587 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-e031-account-create-update-smc8t"] Mar 19 10:53:53 crc kubenswrapper[4765]: I0319 10:53:53.160069 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-db-create-bg8xf"] Mar 19 10:53:53 crc kubenswrapper[4765]: I0319 10:53:53.167769 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-db-create-7fsgp"] Mar 19 10:53:53 crc kubenswrapper[4765]: I0319 10:53:53.176302 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-api-4a7c-account-create-update-tbxvw"] Mar 19 10:53:53 crc kubenswrapper[4765]: I0319 10:53:53.183864 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-31ef-account-create-update-kcvq5"] Mar 19 10:53:53 crc kubenswrapper[4765]: I0319 10:53:53.191281 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-db-create-7fsgp"] Mar 19 10:53:53 crc kubenswrapper[4765]: I0319 10:53:53.614900 4765 
generic.go:334] "Generic (PLEG): container finished" podID="fd32b580-78f7-478e-ba1d-9d1a86b75f3a" containerID="99bbe0da6b2fce87ecb7a804769f7b38c75b30dfdb9e448be8dd7dcef611021e" exitCode=0 Mar 19 10:53:53 crc kubenswrapper[4765]: I0319 10:53:53.614980 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" event={"ID":"fd32b580-78f7-478e-ba1d-9d1a86b75f3a","Type":"ContainerDied","Data":"99bbe0da6b2fce87ecb7a804769f7b38c75b30dfdb9e448be8dd7dcef611021e"} Mar 19 10:53:54 crc kubenswrapper[4765]: I0319 10:53:54.371651 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1cb1943e-d6aa-4223-8654-5e674a71b734" path="/var/lib/kubelet/pods/1cb1943e-d6aa-4223-8654-5e674a71b734/volumes" Mar 19 10:53:54 crc kubenswrapper[4765]: I0319 10:53:54.373821 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5de4629f-4496-4991-962f-4410df18a713" path="/var/lib/kubelet/pods/5de4629f-4496-4991-962f-4410df18a713/volumes" Mar 19 10:53:54 crc kubenswrapper[4765]: I0319 10:53:54.374650 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="63fab428-8477-40cd-bd57-250471e0d108" path="/var/lib/kubelet/pods/63fab428-8477-40cd-bd57-250471e0d108/volumes" Mar 19 10:53:54 crc kubenswrapper[4765]: I0319 10:53:54.375381 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="97833765-fe7a-40eb-9764-180d2123e113" path="/var/lib/kubelet/pods/97833765-fe7a-40eb-9764-180d2123e113/volumes" Mar 19 10:53:54 crc kubenswrapper[4765]: I0319 10:53:54.377163 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0821b55-a4e1-4b0f-af18-513aefaa8d9e" path="/var/lib/kubelet/pods/d0821b55-a4e1-4b0f-af18-513aefaa8d9e/volumes" Mar 19 10:53:54 crc kubenswrapper[4765]: I0319 10:53:54.378791 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de70af7a-9885-40d1-868d-14c156308212" 
path="/var/lib/kubelet/pods/de70af7a-9885-40d1-868d-14c156308212/volumes" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.025550 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.117115 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4twkx\" (UniqueName: \"kubernetes.io/projected/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-kube-api-access-4twkx\") pod \"fd32b580-78f7-478e-ba1d-9d1a86b75f3a\" (UID: \"fd32b580-78f7-478e-ba1d-9d1a86b75f3a\") " Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.117240 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-ssh-key-openstack-edpm-ipam\") pod \"fd32b580-78f7-478e-ba1d-9d1a86b75f3a\" (UID: \"fd32b580-78f7-478e-ba1d-9d1a86b75f3a\") " Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.117293 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-inventory\") pod \"fd32b580-78f7-478e-ba1d-9d1a86b75f3a\" (UID: \"fd32b580-78f7-478e-ba1d-9d1a86b75f3a\") " Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.124700 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-kube-api-access-4twkx" (OuterVolumeSpecName: "kube-api-access-4twkx") pod "fd32b580-78f7-478e-ba1d-9d1a86b75f3a" (UID: "fd32b580-78f7-478e-ba1d-9d1a86b75f3a"). InnerVolumeSpecName "kube-api-access-4twkx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.147475 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "fd32b580-78f7-478e-ba1d-9d1a86b75f3a" (UID: "fd32b580-78f7-478e-ba1d-9d1a86b75f3a"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.148201 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-inventory" (OuterVolumeSpecName: "inventory") pod "fd32b580-78f7-478e-ba1d-9d1a86b75f3a" (UID: "fd32b580-78f7-478e-ba1d-9d1a86b75f3a"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.219659 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4twkx\" (UniqueName: \"kubernetes.io/projected/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-kube-api-access-4twkx\") on node \"crc\" DevicePath \"\"" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.219697 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.219710 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/fd32b580-78f7-478e-ba1d-9d1a86b75f3a-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.632942 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" 
event={"ID":"fd32b580-78f7-478e-ba1d-9d1a86b75f3a","Type":"ContainerDied","Data":"c62304cbac352a655322eb030bd8e0f75615be500f7d1c79ec3ca9520a3bdcfd"} Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.633022 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c62304cbac352a655322eb030bd8e0f75615be500f7d1c79ec3ca9520a3bdcfd" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.633020 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.739997 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx"] Mar 19 10:53:55 crc kubenswrapper[4765]: E0319 10:53:55.740470 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd32b580-78f7-478e-ba1d-9d1a86b75f3a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.740501 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd32b580-78f7-478e-ba1d-9d1a86b75f3a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.740737 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd32b580-78f7-478e-ba1d-9d1a86b75f3a" containerName="configure-network-edpm-deployment-openstack-edpm-ipam" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.741504 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.744016 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.744172 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.744314 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.744488 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.751743 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx"] Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.830286 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a573215-571a-49dc-9903-82134a77d196-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx\" (UID: \"3a573215-571a-49dc-9903-82134a77d196\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.830486 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsr8q\" (UniqueName: \"kubernetes.io/projected/3a573215-571a-49dc-9903-82134a77d196-kube-api-access-lsr8q\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx\" (UID: \"3a573215-571a-49dc-9903-82134a77d196\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" Mar 19 
10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.830561 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a573215-571a-49dc-9903-82134a77d196-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx\" (UID: \"3a573215-571a-49dc-9903-82134a77d196\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.932914 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a573215-571a-49dc-9903-82134a77d196-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx\" (UID: \"3a573215-571a-49dc-9903-82134a77d196\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.933203 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lsr8q\" (UniqueName: \"kubernetes.io/projected/3a573215-571a-49dc-9903-82134a77d196-kube-api-access-lsr8q\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx\" (UID: \"3a573215-571a-49dc-9903-82134a77d196\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.933312 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a573215-571a-49dc-9903-82134a77d196-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx\" (UID: \"3a573215-571a-49dc-9903-82134a77d196\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.938185 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/3a573215-571a-49dc-9903-82134a77d196-ssh-key-openstack-edpm-ipam\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx\" (UID: \"3a573215-571a-49dc-9903-82134a77d196\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.943875 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a573215-571a-49dc-9903-82134a77d196-inventory\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx\" (UID: \"3a573215-571a-49dc-9903-82134a77d196\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" Mar 19 10:53:55 crc kubenswrapper[4765]: I0319 10:53:55.963606 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsr8q\" (UniqueName: \"kubernetes.io/projected/3a573215-571a-49dc-9903-82134a77d196-kube-api-access-lsr8q\") pod \"validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx\" (UID: \"3a573215-571a-49dc-9903-82134a77d196\") " pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" Mar 19 10:53:56 crc kubenswrapper[4765]: I0319 10:53:56.059012 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" Mar 19 10:53:56 crc kubenswrapper[4765]: I0319 10:53:56.581846 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx"] Mar 19 10:53:56 crc kubenswrapper[4765]: I0319 10:53:56.645308 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" event={"ID":"3a573215-571a-49dc-9903-82134a77d196","Type":"ContainerStarted","Data":"359c2436d4cef00a045806668376238ae3b37141962dbcee50dd88406dd73998"} Mar 19 10:53:58 crc kubenswrapper[4765]: I0319 10:53:58.664642 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" event={"ID":"3a573215-571a-49dc-9903-82134a77d196","Type":"ContainerStarted","Data":"4fed843b61dffef25cfa6026d3519080fa72b3d0bfd1b7b85c280324e115caa6"} Mar 19 10:53:58 crc kubenswrapper[4765]: I0319 10:53:58.693585 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" podStartSLOduration=2.69697109 podStartE2EDuration="3.693560085s" podCreationTimestamp="2026-03-19 10:53:55 +0000 UTC" firstStartedPulling="2026-03-19 10:53:56.596877623 +0000 UTC m=+1934.945823155" lastFinishedPulling="2026-03-19 10:53:57.593466588 +0000 UTC m=+1935.942412150" observedRunningTime="2026-03-19 10:53:58.692012203 +0000 UTC m=+1937.040957735" watchObservedRunningTime="2026-03-19 10:53:58.693560085 +0000 UTC m=+1937.042505627" Mar 19 10:53:59 crc kubenswrapper[4765]: I0319 10:53:59.451261 4765 scope.go:117] "RemoveContainer" containerID="a0e7bd5f7d8c30648a07ded5e04182fb4b1b36a99584c14b6806d67eba09527b" Mar 19 10:53:59 crc kubenswrapper[4765]: I0319 10:53:59.514060 4765 scope.go:117] "RemoveContainer" containerID="213393381c9495cf24e47a93ec6c2fbe35ee4f3848e3c01fedb9b59b7b70cdb0" 
Mar 19 10:53:59 crc kubenswrapper[4765]: I0319 10:53:59.537950 4765 scope.go:117] "RemoveContainer" containerID="eb368f1410dc53f6a7bcd7b26fa29ac3838dee4fc9d92e44e8e090d0b483cfce" Mar 19 10:53:59 crc kubenswrapper[4765]: I0319 10:53:59.613216 4765 scope.go:117] "RemoveContainer" containerID="594824d6c73c3246a8884eeefdc9303559e3ecafa98a05f9f83de087157e8f7c" Mar 19 10:53:59 crc kubenswrapper[4765]: I0319 10:53:59.676440 4765 scope.go:117] "RemoveContainer" containerID="e196bf8e817931b65767a3792002f17daefaf7f72feb2155ee556f929c4fce0b" Mar 19 10:53:59 crc kubenswrapper[4765]: I0319 10:53:59.703170 4765 scope.go:117] "RemoveContainer" containerID="b7e33e2b91b09b641878bcc075c4d7b5fb8eb68bd0db94f878cf3b9382d50b4b" Mar 19 10:53:59 crc kubenswrapper[4765]: I0319 10:53:59.765403 4765 scope.go:117] "RemoveContainer" containerID="edf040c33203f5d50ab2e2972078f229e81099369bb6ab8599a8caa00aec6236" Mar 19 10:53:59 crc kubenswrapper[4765]: I0319 10:53:59.790922 4765 scope.go:117] "RemoveContainer" containerID="0d09b81495bf5da85632e84b0aaea7be01f3b5e425bd8111c3eaf7ead482fd6f" Mar 19 10:53:59 crc kubenswrapper[4765]: I0319 10:53:59.811993 4765 scope.go:117] "RemoveContainer" containerID="1c943e9ec8e4b88b50621f0140f2ff452c402bcd733d403b06e877382ef1c5e2" Mar 19 10:54:00 crc kubenswrapper[4765]: I0319 10:54:00.141144 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565294-ggg2c"] Mar 19 10:54:00 crc kubenswrapper[4765]: I0319 10:54:00.143430 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565294-ggg2c" Mar 19 10:54:00 crc kubenswrapper[4765]: I0319 10:54:00.147100 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:54:00 crc kubenswrapper[4765]: I0319 10:54:00.147414 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:54:00 crc kubenswrapper[4765]: I0319 10:54:00.149653 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:54:00 crc kubenswrapper[4765]: I0319 10:54:00.160896 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565294-ggg2c"] Mar 19 10:54:00 crc kubenswrapper[4765]: I0319 10:54:00.251352 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qwflc\" (UniqueName: \"kubernetes.io/projected/d2bbbd3f-dbde-41f4-ad9a-9186d51640e1-kube-api-access-qwflc\") pod \"auto-csr-approver-29565294-ggg2c\" (UID: \"d2bbbd3f-dbde-41f4-ad9a-9186d51640e1\") " pod="openshift-infra/auto-csr-approver-29565294-ggg2c" Mar 19 10:54:00 crc kubenswrapper[4765]: I0319 10:54:00.353280 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qwflc\" (UniqueName: \"kubernetes.io/projected/d2bbbd3f-dbde-41f4-ad9a-9186d51640e1-kube-api-access-qwflc\") pod \"auto-csr-approver-29565294-ggg2c\" (UID: \"d2bbbd3f-dbde-41f4-ad9a-9186d51640e1\") " pod="openshift-infra/auto-csr-approver-29565294-ggg2c" Mar 19 10:54:00 crc kubenswrapper[4765]: I0319 10:54:00.376328 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qwflc\" (UniqueName: \"kubernetes.io/projected/d2bbbd3f-dbde-41f4-ad9a-9186d51640e1-kube-api-access-qwflc\") pod \"auto-csr-approver-29565294-ggg2c\" (UID: \"d2bbbd3f-dbde-41f4-ad9a-9186d51640e1\") " 
pod="openshift-infra/auto-csr-approver-29565294-ggg2c" Mar 19 10:54:00 crc kubenswrapper[4765]: I0319 10:54:00.482825 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565294-ggg2c" Mar 19 10:54:00 crc kubenswrapper[4765]: I0319 10:54:00.961160 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565294-ggg2c"] Mar 19 10:54:01 crc kubenswrapper[4765]: I0319 10:54:01.356465 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:54:01 crc kubenswrapper[4765]: E0319 10:54:01.357375 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:54:01 crc kubenswrapper[4765]: I0319 10:54:01.711368 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565294-ggg2c" event={"ID":"d2bbbd3f-dbde-41f4-ad9a-9186d51640e1","Type":"ContainerStarted","Data":"34ceac4c89a78404ca46b0bfac8314fd4537d5cac96dc14254eb8df9889af1fb"} Mar 19 10:54:02 crc kubenswrapper[4765]: I0319 10:54:02.723239 4765 generic.go:334] "Generic (PLEG): container finished" podID="3a573215-571a-49dc-9903-82134a77d196" containerID="4fed843b61dffef25cfa6026d3519080fa72b3d0bfd1b7b85c280324e115caa6" exitCode=0 Mar 19 10:54:02 crc kubenswrapper[4765]: I0319 10:54:02.723312 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" 
event={"ID":"3a573215-571a-49dc-9903-82134a77d196","Type":"ContainerDied","Data":"4fed843b61dffef25cfa6026d3519080fa72b3d0bfd1b7b85c280324e115caa6"} Mar 19 10:54:02 crc kubenswrapper[4765]: I0319 10:54:02.727479 4765 generic.go:334] "Generic (PLEG): container finished" podID="d2bbbd3f-dbde-41f4-ad9a-9186d51640e1" containerID="ea782b2e67066acf7cb5bd5dc88e3bc505ce123975f183eb849fc31610a22820" exitCode=0 Mar 19 10:54:02 crc kubenswrapper[4765]: I0319 10:54:02.727557 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565294-ggg2c" event={"ID":"d2bbbd3f-dbde-41f4-ad9a-9186d51640e1","Type":"ContainerDied","Data":"ea782b2e67066acf7cb5bd5dc88e3bc505ce123975f183eb849fc31610a22820"} Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.171151 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565294-ggg2c" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.249504 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qwflc\" (UniqueName: \"kubernetes.io/projected/d2bbbd3f-dbde-41f4-ad9a-9186d51640e1-kube-api-access-qwflc\") pod \"d2bbbd3f-dbde-41f4-ad9a-9186d51640e1\" (UID: \"d2bbbd3f-dbde-41f4-ad9a-9186d51640e1\") " Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.261588 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d2bbbd3f-dbde-41f4-ad9a-9186d51640e1-kube-api-access-qwflc" (OuterVolumeSpecName: "kube-api-access-qwflc") pod "d2bbbd3f-dbde-41f4-ad9a-9186d51640e1" (UID: "d2bbbd3f-dbde-41f4-ad9a-9186d51640e1"). InnerVolumeSpecName "kube-api-access-qwflc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.317812 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.352452 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a573215-571a-49dc-9903-82134a77d196-inventory\") pod \"3a573215-571a-49dc-9903-82134a77d196\" (UID: \"3a573215-571a-49dc-9903-82134a77d196\") " Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.352540 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a573215-571a-49dc-9903-82134a77d196-ssh-key-openstack-edpm-ipam\") pod \"3a573215-571a-49dc-9903-82134a77d196\" (UID: \"3a573215-571a-49dc-9903-82134a77d196\") " Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.352595 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsr8q\" (UniqueName: \"kubernetes.io/projected/3a573215-571a-49dc-9903-82134a77d196-kube-api-access-lsr8q\") pod \"3a573215-571a-49dc-9903-82134a77d196\" (UID: \"3a573215-571a-49dc-9903-82134a77d196\") " Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.353697 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qwflc\" (UniqueName: \"kubernetes.io/projected/d2bbbd3f-dbde-41f4-ad9a-9186d51640e1-kube-api-access-qwflc\") on node \"crc\" DevicePath \"\"" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.358107 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a573215-571a-49dc-9903-82134a77d196-kube-api-access-lsr8q" (OuterVolumeSpecName: "kube-api-access-lsr8q") pod "3a573215-571a-49dc-9903-82134a77d196" (UID: "3a573215-571a-49dc-9903-82134a77d196"). InnerVolumeSpecName "kube-api-access-lsr8q". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.388160 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a573215-571a-49dc-9903-82134a77d196-inventory" (OuterVolumeSpecName: "inventory") pod "3a573215-571a-49dc-9903-82134a77d196" (UID: "3a573215-571a-49dc-9903-82134a77d196"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.391703 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a573215-571a-49dc-9903-82134a77d196-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "3a573215-571a-49dc-9903-82134a77d196" (UID: "3a573215-571a-49dc-9903-82134a77d196"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.455817 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/3a573215-571a-49dc-9903-82134a77d196-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.455868 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/3a573215-571a-49dc-9903-82134a77d196-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.455879 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lsr8q\" (UniqueName: \"kubernetes.io/projected/3a573215-571a-49dc-9903-82134a77d196-kube-api-access-lsr8q\") on node \"crc\" DevicePath \"\"" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.746814 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565294-ggg2c" 
event={"ID":"d2bbbd3f-dbde-41f4-ad9a-9186d51640e1","Type":"ContainerDied","Data":"34ceac4c89a78404ca46b0bfac8314fd4537d5cac96dc14254eb8df9889af1fb"} Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.746849 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="34ceac4c89a78404ca46b0bfac8314fd4537d5cac96dc14254eb8df9889af1fb" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.746851 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565294-ggg2c" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.748258 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" event={"ID":"3a573215-571a-49dc-9903-82134a77d196","Type":"ContainerDied","Data":"359c2436d4cef00a045806668376238ae3b37141962dbcee50dd88406dd73998"} Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.748278 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="359c2436d4cef00a045806668376238ae3b37141962dbcee50dd88406dd73998" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.748284 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.818581 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg"] Mar 19 10:54:04 crc kubenswrapper[4765]: E0319 10:54:04.819089 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d2bbbd3f-dbde-41f4-ad9a-9186d51640e1" containerName="oc" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.819115 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d2bbbd3f-dbde-41f4-ad9a-9186d51640e1" containerName="oc" Mar 19 10:54:04 crc kubenswrapper[4765]: E0319 10:54:04.819131 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a573215-571a-49dc-9903-82134a77d196" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.819139 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a573215-571a-49dc-9903-82134a77d196" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.819340 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d2bbbd3f-dbde-41f4-ad9a-9186d51640e1" containerName="oc" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.819370 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a573215-571a-49dc-9903-82134a77d196" containerName="validate-network-edpm-deployment-openstack-edpm-ipam" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.819989 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.822233 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.823645 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.824062 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.824414 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.835328 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg"] Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.863921 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4x5vg\" (UID: \"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.864127 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4x5vg\" (UID: \"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.864176 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4m66\" (UniqueName: \"kubernetes.io/projected/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-kube-api-access-f4m66\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4x5vg\" (UID: \"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.966037 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4x5vg\" (UID: \"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.966246 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-inventory\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4x5vg\" (UID: \"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.966295 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4m66\" (UniqueName: \"kubernetes.io/projected/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-kube-api-access-f4m66\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4x5vg\" (UID: \"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.970704 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-inventory\") pod 
\"install-os-edpm-deployment-openstack-edpm-ipam-4x5vg\" (UID: \"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.970704 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-ssh-key-openstack-edpm-ipam\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4x5vg\" (UID: \"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" Mar 19 10:54:04 crc kubenswrapper[4765]: I0319 10:54:04.986452 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4m66\" (UniqueName: \"kubernetes.io/projected/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-kube-api-access-f4m66\") pod \"install-os-edpm-deployment-openstack-edpm-ipam-4x5vg\" (UID: \"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8\") " pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" Mar 19 10:54:05 crc kubenswrapper[4765]: I0319 10:54:05.143118 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" Mar 19 10:54:05 crc kubenswrapper[4765]: I0319 10:54:05.255415 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565288-bmzpl"] Mar 19 10:54:05 crc kubenswrapper[4765]: I0319 10:54:05.273389 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565288-bmzpl"] Mar 19 10:54:05 crc kubenswrapper[4765]: I0319 10:54:05.751988 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg"] Mar 19 10:54:06 crc kubenswrapper[4765]: I0319 10:54:06.370422 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2cbc4a7d-5dd1-4318-95eb-4e2047bfff19" path="/var/lib/kubelet/pods/2cbc4a7d-5dd1-4318-95eb-4e2047bfff19/volumes" Mar 19 10:54:06 crc kubenswrapper[4765]: I0319 10:54:06.768142 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" event={"ID":"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8","Type":"ContainerStarted","Data":"334271ee3a9b5780ee495d41a0d14dea61166e1aa58ff6dc2084a094e356e3c1"} Mar 19 10:54:07 crc kubenswrapper[4765]: I0319 10:54:07.777205 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" event={"ID":"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8","Type":"ContainerStarted","Data":"b9f9cb3196c7fbe8cea30a0f85273ea7929b9d0b3a4a59072e7c5e3a5b65d2e7"} Mar 19 10:54:07 crc kubenswrapper[4765]: I0319 10:54:07.805747 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" podStartSLOduration=3.0544957679999998 podStartE2EDuration="3.805723267s" podCreationTimestamp="2026-03-19 10:54:04 +0000 UTC" firstStartedPulling="2026-03-19 10:54:05.756261582 +0000 UTC m=+1944.105207124" 
lastFinishedPulling="2026-03-19 10:54:06.507489081 +0000 UTC m=+1944.856434623" observedRunningTime="2026-03-19 10:54:07.794476252 +0000 UTC m=+1946.143421834" watchObservedRunningTime="2026-03-19 10:54:07.805723267 +0000 UTC m=+1946.154668809" Mar 19 10:54:14 crc kubenswrapper[4765]: I0319 10:54:14.360122 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:54:14 crc kubenswrapper[4765]: E0319 10:54:14.365641 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:54:25 crc kubenswrapper[4765]: I0319 10:54:25.356086 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:54:25 crc kubenswrapper[4765]: E0319 10:54:25.356777 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:54:36 crc kubenswrapper[4765]: I0319 10:54:36.356654 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:54:36 crc kubenswrapper[4765]: E0319 10:54:36.357655 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:54:41 crc kubenswrapper[4765]: I0319 10:54:41.069708 4765 generic.go:334] "Generic (PLEG): container finished" podID="c09e1efc-02d2-4e0f-9e16-36d9627e0fb8" containerID="b9f9cb3196c7fbe8cea30a0f85273ea7929b9d0b3a4a59072e7c5e3a5b65d2e7" exitCode=0 Mar 19 10:54:41 crc kubenswrapper[4765]: I0319 10:54:41.069839 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" event={"ID":"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8","Type":"ContainerDied","Data":"b9f9cb3196c7fbe8cea30a0f85273ea7929b9d0b3a4a59072e7c5e3a5b65d2e7"} Mar 19 10:54:42 crc kubenswrapper[4765]: I0319 10:54:42.463486 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" Mar 19 10:54:42 crc kubenswrapper[4765]: I0319 10:54:42.539239 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-ssh-key-openstack-edpm-ipam\") pod \"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8\" (UID: \"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8\") " Mar 19 10:54:42 crc kubenswrapper[4765]: I0319 10:54:42.539453 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4m66\" (UniqueName: \"kubernetes.io/projected/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-kube-api-access-f4m66\") pod \"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8\" (UID: \"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8\") " Mar 19 10:54:42 crc kubenswrapper[4765]: I0319 10:54:42.539641 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-inventory\") pod \"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8\" (UID: \"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8\") " Mar 19 10:54:42 crc kubenswrapper[4765]: I0319 10:54:42.544861 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-kube-api-access-f4m66" (OuterVolumeSpecName: "kube-api-access-f4m66") pod "c09e1efc-02d2-4e0f-9e16-36d9627e0fb8" (UID: "c09e1efc-02d2-4e0f-9e16-36d9627e0fb8"). InnerVolumeSpecName "kube-api-access-f4m66". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:54:42 crc kubenswrapper[4765]: I0319 10:54:42.566152 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "c09e1efc-02d2-4e0f-9e16-36d9627e0fb8" (UID: "c09e1efc-02d2-4e0f-9e16-36d9627e0fb8"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:54:42 crc kubenswrapper[4765]: I0319 10:54:42.568396 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-inventory" (OuterVolumeSpecName: "inventory") pod "c09e1efc-02d2-4e0f-9e16-36d9627e0fb8" (UID: "c09e1efc-02d2-4e0f-9e16-36d9627e0fb8"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:54:42 crc kubenswrapper[4765]: I0319 10:54:42.642066 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:54:42 crc kubenswrapper[4765]: I0319 10:54:42.642099 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:54:42 crc kubenswrapper[4765]: I0319 10:54:42.642110 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4m66\" (UniqueName: \"kubernetes.io/projected/c09e1efc-02d2-4e0f-9e16-36d9627e0fb8-kube-api-access-f4m66\") on node \"crc\" DevicePath \"\"" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.087630 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" event={"ID":"c09e1efc-02d2-4e0f-9e16-36d9627e0fb8","Type":"ContainerDied","Data":"334271ee3a9b5780ee495d41a0d14dea61166e1aa58ff6dc2084a094e356e3c1"} Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.087696 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="334271ee3a9b5780ee495d41a0d14dea61166e1aa58ff6dc2084a094e356e3c1" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.087754 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-os-edpm-deployment-openstack-edpm-ipam-4x5vg" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.196236 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g"] Mar 19 10:54:43 crc kubenswrapper[4765]: E0319 10:54:43.196835 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c09e1efc-02d2-4e0f-9e16-36d9627e0fb8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.196862 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c09e1efc-02d2-4e0f-9e16-36d9627e0fb8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.197111 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c09e1efc-02d2-4e0f-9e16-36d9627e0fb8" containerName="install-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.197871 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.200245 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.200288 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.200777 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.201067 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.207582 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g"] Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.273103 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/546dffbe-3a15-4074-a5be-deac4d1530e3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g\" (UID: \"546dffbe-3a15-4074-a5be-deac4d1530e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.273248 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6klk\" (UniqueName: \"kubernetes.io/projected/546dffbe-3a15-4074-a5be-deac4d1530e3-kube-api-access-x6klk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g\" (UID: \"546dffbe-3a15-4074-a5be-deac4d1530e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" Mar 19 10:54:43 crc 
kubenswrapper[4765]: I0319 10:54:43.273357 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/546dffbe-3a15-4074-a5be-deac4d1530e3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g\" (UID: \"546dffbe-3a15-4074-a5be-deac4d1530e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.375871 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/546dffbe-3a15-4074-a5be-deac4d1530e3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g\" (UID: \"546dffbe-3a15-4074-a5be-deac4d1530e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.376051 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/546dffbe-3a15-4074-a5be-deac4d1530e3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g\" (UID: \"546dffbe-3a15-4074-a5be-deac4d1530e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.376235 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6klk\" (UniqueName: \"kubernetes.io/projected/546dffbe-3a15-4074-a5be-deac4d1530e3-kube-api-access-x6klk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g\" (UID: \"546dffbe-3a15-4074-a5be-deac4d1530e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.381912 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: 
\"kubernetes.io/secret/546dffbe-3a15-4074-a5be-deac4d1530e3-inventory\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g\" (UID: \"546dffbe-3a15-4074-a5be-deac4d1530e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.389051 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/546dffbe-3a15-4074-a5be-deac4d1530e3-ssh-key-openstack-edpm-ipam\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g\" (UID: \"546dffbe-3a15-4074-a5be-deac4d1530e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.392683 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6klk\" (UniqueName: \"kubernetes.io/projected/546dffbe-3a15-4074-a5be-deac4d1530e3-kube-api-access-x6klk\") pod \"configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g\" (UID: \"546dffbe-3a15-4074-a5be-deac4d1530e3\") " pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" Mar 19 10:54:43 crc kubenswrapper[4765]: I0319 10:54:43.515990 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" Mar 19 10:54:44 crc kubenswrapper[4765]: I0319 10:54:44.042493 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g"] Mar 19 10:54:44 crc kubenswrapper[4765]: I0319 10:54:44.095237 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" event={"ID":"546dffbe-3a15-4074-a5be-deac4d1530e3","Type":"ContainerStarted","Data":"fd10dcdb59957f4257b750de2773fa27b84d9e2e38e688290882b7cecfc06034"} Mar 19 10:54:45 crc kubenswrapper[4765]: I0319 10:54:45.105295 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" event={"ID":"546dffbe-3a15-4074-a5be-deac4d1530e3","Type":"ContainerStarted","Data":"5e25550d1b86f01bbd0c6814ae07df4bb62de52e6a3c0f81654000f413f3180f"} Mar 19 10:54:45 crc kubenswrapper[4765]: I0319 10:54:45.128974 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" podStartSLOduration=1.441798493 podStartE2EDuration="2.128937576s" podCreationTimestamp="2026-03-19 10:54:43 +0000 UTC" firstStartedPulling="2026-03-19 10:54:44.041443679 +0000 UTC m=+1982.390389221" lastFinishedPulling="2026-03-19 10:54:44.728582742 +0000 UTC m=+1983.077528304" observedRunningTime="2026-03-19 10:54:45.119245233 +0000 UTC m=+1983.468190775" watchObservedRunningTime="2026-03-19 10:54:45.128937576 +0000 UTC m=+1983.477883118" Mar 19 10:54:47 crc kubenswrapper[4765]: I0319 10:54:47.356287 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:54:47 crc kubenswrapper[4765]: E0319 10:54:47.357857 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with 
CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:54:52 crc kubenswrapper[4765]: I0319 10:54:52.041114 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rvf9r"] Mar 19 10:54:52 crc kubenswrapper[4765]: I0319 10:54:52.050663 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-conductor-db-sync-rvf9r"] Mar 19 10:54:52 crc kubenswrapper[4765]: I0319 10:54:52.366500 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0411109-7b7f-4013-baef-8970df3e2dbf" path="/var/lib/kubelet/pods/d0411109-7b7f-4013-baef-8970df3e2dbf/volumes" Mar 19 10:54:59 crc kubenswrapper[4765]: I0319 10:54:59.357168 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:54:59 crc kubenswrapper[4765]: E0319 10:54:59.357951 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 10:55:00 crc kubenswrapper[4765]: I0319 10:55:00.004794 4765 scope.go:117] "RemoveContainer" containerID="8f69d0cf3afd3b05c0afe19bba5388841ccaf130da23fbcd1abebed1314ca46b" Mar 19 10:55:00 crc kubenswrapper[4765]: I0319 10:55:00.049859 4765 scope.go:117] "RemoveContainer" containerID="f7d4a6c1acc5018cdd82df547db6d4304cc5a336afb590dbb8e98e36da2cb4f7" Mar 19 10:55:11 crc kubenswrapper[4765]: I0319 10:55:11.042578 4765 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fm985"] Mar 19 10:55:11 crc kubenswrapper[4765]: I0319 10:55:11.050653 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-conductor-db-sync-fm985"] Mar 19 10:55:11 crc kubenswrapper[4765]: I0319 10:55:11.356795 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984" Mar 19 10:55:12 crc kubenswrapper[4765]: I0319 10:55:12.063719 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell0-cell-mapping-7rjkg"] Mar 19 10:55:12 crc kubenswrapper[4765]: I0319 10:55:12.076692 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell0-cell-mapping-7rjkg"] Mar 19 10:55:12 crc kubenswrapper[4765]: I0319 10:55:12.325101 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"ab93876b58cf4a5eb3ff787b278638a8ff1606d52b12f8abdbd4265ceb51f06d"} Mar 19 10:55:12 crc kubenswrapper[4765]: I0319 10:55:12.366713 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="313b3021-c103-47ac-9cb5-b38e971d22fd" path="/var/lib/kubelet/pods/313b3021-c103-47ac-9cb5-b38e971d22fd/volumes" Mar 19 10:55:12 crc kubenswrapper[4765]: I0319 10:55:12.367337 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7" path="/var/lib/kubelet/pods/ff89b19a-24e7-4a8c-abd4-cfc23a17d7c7/volumes" Mar 19 10:55:25 crc kubenswrapper[4765]: I0319 10:55:25.433252 4765 generic.go:334] "Generic (PLEG): container finished" podID="546dffbe-3a15-4074-a5be-deac4d1530e3" containerID="5e25550d1b86f01bbd0c6814ae07df4bb62de52e6a3c0f81654000f413f3180f" exitCode=0 Mar 19 10:55:25 crc kubenswrapper[4765]: I0319 10:55:25.433341 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" event={"ID":"546dffbe-3a15-4074-a5be-deac4d1530e3","Type":"ContainerDied","Data":"5e25550d1b86f01bbd0c6814ae07df4bb62de52e6a3c0f81654000f413f3180f"} Mar 19 10:55:26 crc kubenswrapper[4765]: I0319 10:55:26.811662 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" Mar 19 10:55:26 crc kubenswrapper[4765]: I0319 10:55:26.987255 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6klk\" (UniqueName: \"kubernetes.io/projected/546dffbe-3a15-4074-a5be-deac4d1530e3-kube-api-access-x6klk\") pod \"546dffbe-3a15-4074-a5be-deac4d1530e3\" (UID: \"546dffbe-3a15-4074-a5be-deac4d1530e3\") " Mar 19 10:55:26 crc kubenswrapper[4765]: I0319 10:55:26.987333 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/546dffbe-3a15-4074-a5be-deac4d1530e3-inventory\") pod \"546dffbe-3a15-4074-a5be-deac4d1530e3\" (UID: \"546dffbe-3a15-4074-a5be-deac4d1530e3\") " Mar 19 10:55:26 crc kubenswrapper[4765]: I0319 10:55:26.987369 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/546dffbe-3a15-4074-a5be-deac4d1530e3-ssh-key-openstack-edpm-ipam\") pod \"546dffbe-3a15-4074-a5be-deac4d1530e3\" (UID: \"546dffbe-3a15-4074-a5be-deac4d1530e3\") " Mar 19 10:55:26 crc kubenswrapper[4765]: I0319 10:55:26.992538 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/546dffbe-3a15-4074-a5be-deac4d1530e3-kube-api-access-x6klk" (OuterVolumeSpecName: "kube-api-access-x6klk") pod "546dffbe-3a15-4074-a5be-deac4d1530e3" (UID: "546dffbe-3a15-4074-a5be-deac4d1530e3"). InnerVolumeSpecName "kube-api-access-x6klk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.012888 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546dffbe-3a15-4074-a5be-deac4d1530e3-inventory" (OuterVolumeSpecName: "inventory") pod "546dffbe-3a15-4074-a5be-deac4d1530e3" (UID: "546dffbe-3a15-4074-a5be-deac4d1530e3"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.016349 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/546dffbe-3a15-4074-a5be-deac4d1530e3-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "546dffbe-3a15-4074-a5be-deac4d1530e3" (UID: "546dffbe-3a15-4074-a5be-deac4d1530e3"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.090359 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6klk\" (UniqueName: \"kubernetes.io/projected/546dffbe-3a15-4074-a5be-deac4d1530e3-kube-api-access-x6klk\") on node \"crc\" DevicePath \"\"" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.090407 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/546dffbe-3a15-4074-a5be-deac4d1530e3-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.090422 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/546dffbe-3a15-4074-a5be-deac4d1530e3-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.452119 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" 
event={"ID":"546dffbe-3a15-4074-a5be-deac4d1530e3","Type":"ContainerDied","Data":"fd10dcdb59957f4257b750de2773fa27b84d9e2e38e688290882b7cecfc06034"} Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.452292 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fd10dcdb59957f4257b750de2773fa27b84d9e2e38e688290882b7cecfc06034" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.452411 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.540526 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gc6dd"] Mar 19 10:55:27 crc kubenswrapper[4765]: E0319 10:55:27.541033 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546dffbe-3a15-4074-a5be-deac4d1530e3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.541056 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="546dffbe-3a15-4074-a5be-deac4d1530e3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.541346 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="546dffbe-3a15-4074-a5be-deac4d1530e3" containerName="configure-os-edpm-deployment-openstack-edpm-ipam" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.542123 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.545697 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.546058 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.546274 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.546489 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.554046 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gc6dd"] Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.702555 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f46a25a2-f362-487c-9511-b9888a18b08e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-gc6dd\" (UID: \"f46a25a2-f362-487c-9511-b9888a18b08e\") " pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.702618 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nn68s\" (UniqueName: \"kubernetes.io/projected/f46a25a2-f362-487c-9511-b9888a18b08e-kube-api-access-nn68s\") pod \"ssh-known-hosts-edpm-deployment-gc6dd\" (UID: \"f46a25a2-f362-487c-9511-b9888a18b08e\") " pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.702660 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f46a25a2-f362-487c-9511-b9888a18b08e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-gc6dd\" (UID: \"f46a25a2-f362-487c-9511-b9888a18b08e\") " pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.804613 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f46a25a2-f362-487c-9511-b9888a18b08e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-gc6dd\" (UID: \"f46a25a2-f362-487c-9511-b9888a18b08e\") " pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.804724 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nn68s\" (UniqueName: \"kubernetes.io/projected/f46a25a2-f362-487c-9511-b9888a18b08e-kube-api-access-nn68s\") pod \"ssh-known-hosts-edpm-deployment-gc6dd\" (UID: \"f46a25a2-f362-487c-9511-b9888a18b08e\") " pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.805247 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f46a25a2-f362-487c-9511-b9888a18b08e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-gc6dd\" (UID: \"f46a25a2-f362-487c-9511-b9888a18b08e\") " pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.812268 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f46a25a2-f362-487c-9511-b9888a18b08e-ssh-key-openstack-edpm-ipam\") pod \"ssh-known-hosts-edpm-deployment-gc6dd\" (UID: \"f46a25a2-f362-487c-9511-b9888a18b08e\") " pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" Mar 19 10:55:27 crc kubenswrapper[4765]: 
I0319 10:55:27.812563 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f46a25a2-f362-487c-9511-b9888a18b08e-inventory-0\") pod \"ssh-known-hosts-edpm-deployment-gc6dd\" (UID: \"f46a25a2-f362-487c-9511-b9888a18b08e\") " pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.825446 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nn68s\" (UniqueName: \"kubernetes.io/projected/f46a25a2-f362-487c-9511-b9888a18b08e-kube-api-access-nn68s\") pod \"ssh-known-hosts-edpm-deployment-gc6dd\" (UID: \"f46a25a2-f362-487c-9511-b9888a18b08e\") " pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" Mar 19 10:55:27 crc kubenswrapper[4765]: I0319 10:55:27.867095 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" Mar 19 10:55:28 crc kubenswrapper[4765]: I0319 10:55:28.174027 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g58gl"] Mar 19 10:55:28 crc kubenswrapper[4765]: I0319 10:55:28.176305 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:28 crc kubenswrapper[4765]: I0319 10:55:28.187833 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g58gl"] Mar 19 10:55:28 crc kubenswrapper[4765]: I0319 10:55:28.315791 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2941c2c-827b-4461-8bf0-1b2d75feafc2-catalog-content\") pod \"community-operators-g58gl\" (UID: \"b2941c2c-827b-4461-8bf0-1b2d75feafc2\") " pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:28 crc kubenswrapper[4765]: I0319 10:55:28.315854 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2941c2c-827b-4461-8bf0-1b2d75feafc2-utilities\") pod \"community-operators-g58gl\" (UID: \"b2941c2c-827b-4461-8bf0-1b2d75feafc2\") " pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:28 crc kubenswrapper[4765]: I0319 10:55:28.315880 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dp84g\" (UniqueName: \"kubernetes.io/projected/b2941c2c-827b-4461-8bf0-1b2d75feafc2-kube-api-access-dp84g\") pod \"community-operators-g58gl\" (UID: \"b2941c2c-827b-4461-8bf0-1b2d75feafc2\") " pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:28 crc kubenswrapper[4765]: I0319 10:55:28.376924 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ssh-known-hosts-edpm-deployment-gc6dd"] Mar 19 10:55:28 crc kubenswrapper[4765]: I0319 10:55:28.417599 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2941c2c-827b-4461-8bf0-1b2d75feafc2-catalog-content\") pod \"community-operators-g58gl\" (UID: 
\"b2941c2c-827b-4461-8bf0-1b2d75feafc2\") " pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:28 crc kubenswrapper[4765]: I0319 10:55:28.417918 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2941c2c-827b-4461-8bf0-1b2d75feafc2-utilities\") pod \"community-operators-g58gl\" (UID: \"b2941c2c-827b-4461-8bf0-1b2d75feafc2\") " pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:28 crc kubenswrapper[4765]: I0319 10:55:28.417947 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dp84g\" (UniqueName: \"kubernetes.io/projected/b2941c2c-827b-4461-8bf0-1b2d75feafc2-kube-api-access-dp84g\") pod \"community-operators-g58gl\" (UID: \"b2941c2c-827b-4461-8bf0-1b2d75feafc2\") " pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:28 crc kubenswrapper[4765]: I0319 10:55:28.418237 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2941c2c-827b-4461-8bf0-1b2d75feafc2-catalog-content\") pod \"community-operators-g58gl\" (UID: \"b2941c2c-827b-4461-8bf0-1b2d75feafc2\") " pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:28 crc kubenswrapper[4765]: I0319 10:55:28.418319 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2941c2c-827b-4461-8bf0-1b2d75feafc2-utilities\") pod \"community-operators-g58gl\" (UID: \"b2941c2c-827b-4461-8bf0-1b2d75feafc2\") " pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:28 crc kubenswrapper[4765]: I0319 10:55:28.438500 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dp84g\" (UniqueName: \"kubernetes.io/projected/b2941c2c-827b-4461-8bf0-1b2d75feafc2-kube-api-access-dp84g\") pod \"community-operators-g58gl\" (UID: 
\"b2941c2c-827b-4461-8bf0-1b2d75feafc2\") " pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:28 crc kubenswrapper[4765]: I0319 10:55:28.461791 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" event={"ID":"f46a25a2-f362-487c-9511-b9888a18b08e","Type":"ContainerStarted","Data":"c5d7d35c651131233d793281304f22e22b5c5659409c6020d8c22d296942cd55"} Mar 19 10:55:28 crc kubenswrapper[4765]: I0319 10:55:28.498118 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:29 crc kubenswrapper[4765]: I0319 10:55:29.011915 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g58gl"] Mar 19 10:55:29 crc kubenswrapper[4765]: I0319 10:55:29.471131 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" event={"ID":"f46a25a2-f362-487c-9511-b9888a18b08e","Type":"ContainerStarted","Data":"5317e784a65e2e12a0db5ce7503b7244c601c7d266484cff73ae9053eeff87fe"} Mar 19 10:55:29 crc kubenswrapper[4765]: I0319 10:55:29.472710 4765 generic.go:334] "Generic (PLEG): container finished" podID="b2941c2c-827b-4461-8bf0-1b2d75feafc2" containerID="67a5e80f5dc6370adf914fb88ed0fc3dc4144f5caa99492ee25da2c5f565705e" exitCode=0 Mar 19 10:55:29 crc kubenswrapper[4765]: I0319 10:55:29.472773 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g58gl" event={"ID":"b2941c2c-827b-4461-8bf0-1b2d75feafc2","Type":"ContainerDied","Data":"67a5e80f5dc6370adf914fb88ed0fc3dc4144f5caa99492ee25da2c5f565705e"} Mar 19 10:55:29 crc kubenswrapper[4765]: I0319 10:55:29.473085 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g58gl" 
event={"ID":"b2941c2c-827b-4461-8bf0-1b2d75feafc2","Type":"ContainerStarted","Data":"0fd08d355ef4620ccc2e6d3ad98fc897e916f4604462baf65d4a91f52cc430e3"} Mar 19 10:55:29 crc kubenswrapper[4765]: I0319 10:55:29.493777 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" podStartSLOduration=1.763460432 podStartE2EDuration="2.493761224s" podCreationTimestamp="2026-03-19 10:55:27 +0000 UTC" firstStartedPulling="2026-03-19 10:55:28.392261977 +0000 UTC m=+2026.741207519" lastFinishedPulling="2026-03-19 10:55:29.122562769 +0000 UTC m=+2027.471508311" observedRunningTime="2026-03-19 10:55:29.48954179 +0000 UTC m=+2027.838487342" watchObservedRunningTime="2026-03-19 10:55:29.493761224 +0000 UTC m=+2027.842706766" Mar 19 10:55:30 crc kubenswrapper[4765]: I0319 10:55:30.484246 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g58gl" event={"ID":"b2941c2c-827b-4461-8bf0-1b2d75feafc2","Type":"ContainerStarted","Data":"2cac623e8e77c007a4b54308c163768ee6d107e612e4eb3c09017a01c61458b0"} Mar 19 10:55:31 crc kubenswrapper[4765]: I0319 10:55:31.497934 4765 generic.go:334] "Generic (PLEG): container finished" podID="b2941c2c-827b-4461-8bf0-1b2d75feafc2" containerID="2cac623e8e77c007a4b54308c163768ee6d107e612e4eb3c09017a01c61458b0" exitCode=0 Mar 19 10:55:31 crc kubenswrapper[4765]: I0319 10:55:31.498104 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g58gl" event={"ID":"b2941c2c-827b-4461-8bf0-1b2d75feafc2","Type":"ContainerDied","Data":"2cac623e8e77c007a4b54308c163768ee6d107e612e4eb3c09017a01c61458b0"} Mar 19 10:55:32 crc kubenswrapper[4765]: I0319 10:55:32.510356 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g58gl" 
event={"ID":"b2941c2c-827b-4461-8bf0-1b2d75feafc2","Type":"ContainerStarted","Data":"3bcc8f442fd11dad819b9de6c7dc8f2812afdf6caa37538807974a53ddf15d92"} Mar 19 10:55:32 crc kubenswrapper[4765]: I0319 10:55:32.536732 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g58gl" podStartSLOduration=1.912091566 podStartE2EDuration="4.536700919s" podCreationTimestamp="2026-03-19 10:55:28 +0000 UTC" firstStartedPulling="2026-03-19 10:55:29.474661837 +0000 UTC m=+2027.823607379" lastFinishedPulling="2026-03-19 10:55:32.09927119 +0000 UTC m=+2030.448216732" observedRunningTime="2026-03-19 10:55:32.532724372 +0000 UTC m=+2030.881669914" watchObservedRunningTime="2026-03-19 10:55:32.536700919 +0000 UTC m=+2030.885646461" Mar 19 10:55:32 crc kubenswrapper[4765]: I0319 10:55:32.979639 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-xlzdk"] Mar 19 10:55:32 crc kubenswrapper[4765]: I0319 10:55:32.984759 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xlzdk" Mar 19 10:55:32 crc kubenswrapper[4765]: I0319 10:55:32.993909 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xlzdk"] Mar 19 10:55:33 crc kubenswrapper[4765]: I0319 10:55:33.132646 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b72f4e7-91e9-4dee-95c2-54d750064831-catalog-content\") pod \"certified-operators-xlzdk\" (UID: \"3b72f4e7-91e9-4dee-95c2-54d750064831\") " pod="openshift-marketplace/certified-operators-xlzdk" Mar 19 10:55:33 crc kubenswrapper[4765]: I0319 10:55:33.134193 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b72f4e7-91e9-4dee-95c2-54d750064831-utilities\") pod \"certified-operators-xlzdk\" (UID: \"3b72f4e7-91e9-4dee-95c2-54d750064831\") " pod="openshift-marketplace/certified-operators-xlzdk" Mar 19 10:55:33 crc kubenswrapper[4765]: I0319 10:55:33.134550 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42zc7\" (UniqueName: \"kubernetes.io/projected/3b72f4e7-91e9-4dee-95c2-54d750064831-kube-api-access-42zc7\") pod \"certified-operators-xlzdk\" (UID: \"3b72f4e7-91e9-4dee-95c2-54d750064831\") " pod="openshift-marketplace/certified-operators-xlzdk" Mar 19 10:55:33 crc kubenswrapper[4765]: I0319 10:55:33.236523 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42zc7\" (UniqueName: \"kubernetes.io/projected/3b72f4e7-91e9-4dee-95c2-54d750064831-kube-api-access-42zc7\") pod \"certified-operators-xlzdk\" (UID: \"3b72f4e7-91e9-4dee-95c2-54d750064831\") " pod="openshift-marketplace/certified-operators-xlzdk" Mar 19 10:55:33 crc kubenswrapper[4765]: I0319 10:55:33.236654 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b72f4e7-91e9-4dee-95c2-54d750064831-catalog-content\") pod \"certified-operators-xlzdk\" (UID: \"3b72f4e7-91e9-4dee-95c2-54d750064831\") " pod="openshift-marketplace/certified-operators-xlzdk" Mar 19 10:55:33 crc kubenswrapper[4765]: I0319 10:55:33.236719 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b72f4e7-91e9-4dee-95c2-54d750064831-utilities\") pod \"certified-operators-xlzdk\" (UID: \"3b72f4e7-91e9-4dee-95c2-54d750064831\") " pod="openshift-marketplace/certified-operators-xlzdk" Mar 19 10:55:33 crc kubenswrapper[4765]: I0319 10:55:33.237513 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b72f4e7-91e9-4dee-95c2-54d750064831-utilities\") pod \"certified-operators-xlzdk\" (UID: \"3b72f4e7-91e9-4dee-95c2-54d750064831\") " pod="openshift-marketplace/certified-operators-xlzdk" Mar 19 10:55:33 crc kubenswrapper[4765]: I0319 10:55:33.237789 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b72f4e7-91e9-4dee-95c2-54d750064831-catalog-content\") pod \"certified-operators-xlzdk\" (UID: \"3b72f4e7-91e9-4dee-95c2-54d750064831\") " pod="openshift-marketplace/certified-operators-xlzdk" Mar 19 10:55:33 crc kubenswrapper[4765]: I0319 10:55:33.258488 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42zc7\" (UniqueName: \"kubernetes.io/projected/3b72f4e7-91e9-4dee-95c2-54d750064831-kube-api-access-42zc7\") pod \"certified-operators-xlzdk\" (UID: \"3b72f4e7-91e9-4dee-95c2-54d750064831\") " pod="openshift-marketplace/certified-operators-xlzdk" Mar 19 10:55:33 crc kubenswrapper[4765]: I0319 10:55:33.341211 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-xlzdk" Mar 19 10:55:33 crc kubenswrapper[4765]: I0319 10:55:33.769482 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-xlzdk"] Mar 19 10:55:33 crc kubenswrapper[4765]: W0319 10:55:33.775060 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b72f4e7_91e9_4dee_95c2_54d750064831.slice/crio-a35ec999adb01c6a88d4181fe6864c8aa0f07ff98a09e147da798be8bf06d707 WatchSource:0}: Error finding container a35ec999adb01c6a88d4181fe6864c8aa0f07ff98a09e147da798be8bf06d707: Status 404 returned error can't find the container with id a35ec999adb01c6a88d4181fe6864c8aa0f07ff98a09e147da798be8bf06d707 Mar 19 10:55:34 crc kubenswrapper[4765]: I0319 10:55:34.554707 4765 generic.go:334] "Generic (PLEG): container finished" podID="3b72f4e7-91e9-4dee-95c2-54d750064831" containerID="8edf1a3006aaaa72290a45f5e99ea172283e635200a91b4b048ee4224147f121" exitCode=0 Mar 19 10:55:34 crc kubenswrapper[4765]: I0319 10:55:34.554791 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlzdk" event={"ID":"3b72f4e7-91e9-4dee-95c2-54d750064831","Type":"ContainerDied","Data":"8edf1a3006aaaa72290a45f5e99ea172283e635200a91b4b048ee4224147f121"} Mar 19 10:55:34 crc kubenswrapper[4765]: I0319 10:55:34.555322 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlzdk" event={"ID":"3b72f4e7-91e9-4dee-95c2-54d750064831","Type":"ContainerStarted","Data":"a35ec999adb01c6a88d4181fe6864c8aa0f07ff98a09e147da798be8bf06d707"} Mar 19 10:55:36 crc kubenswrapper[4765]: I0319 10:55:36.574177 4765 generic.go:334] "Generic (PLEG): container finished" podID="f46a25a2-f362-487c-9511-b9888a18b08e" containerID="5317e784a65e2e12a0db5ce7503b7244c601c7d266484cff73ae9053eeff87fe" exitCode=0 Mar 19 10:55:36 crc kubenswrapper[4765]: I0319 
10:55:36.574285 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" event={"ID":"f46a25a2-f362-487c-9511-b9888a18b08e","Type":"ContainerDied","Data":"5317e784a65e2e12a0db5ce7503b7244c601c7d266484cff73ae9053eeff87fe"} Mar 19 10:55:36 crc kubenswrapper[4765]: I0319 10:55:36.577926 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlzdk" event={"ID":"3b72f4e7-91e9-4dee-95c2-54d750064831","Type":"ContainerStarted","Data":"7864e42ab4248cf183b3a18caa4c66d188f9d898e1ab23e5b0c481709be56871"} Mar 19 10:55:37 crc kubenswrapper[4765]: I0319 10:55:37.587844 4765 generic.go:334] "Generic (PLEG): container finished" podID="3b72f4e7-91e9-4dee-95c2-54d750064831" containerID="7864e42ab4248cf183b3a18caa4c66d188f9d898e1ab23e5b0c481709be56871" exitCode=0 Mar 19 10:55:37 crc kubenswrapper[4765]: I0319 10:55:37.587892 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlzdk" event={"ID":"3b72f4e7-91e9-4dee-95c2-54d750064831","Type":"ContainerDied","Data":"7864e42ab4248cf183b3a18caa4c66d188f9d898e1ab23e5b0c481709be56871"} Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.130685 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.241354 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f46a25a2-f362-487c-9511-b9888a18b08e-inventory-0\") pod \"f46a25a2-f362-487c-9511-b9888a18b08e\" (UID: \"f46a25a2-f362-487c-9511-b9888a18b08e\") " Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.241513 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f46a25a2-f362-487c-9511-b9888a18b08e-ssh-key-openstack-edpm-ipam\") pod \"f46a25a2-f362-487c-9511-b9888a18b08e\" (UID: \"f46a25a2-f362-487c-9511-b9888a18b08e\") " Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.241570 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nn68s\" (UniqueName: \"kubernetes.io/projected/f46a25a2-f362-487c-9511-b9888a18b08e-kube-api-access-nn68s\") pod \"f46a25a2-f362-487c-9511-b9888a18b08e\" (UID: \"f46a25a2-f362-487c-9511-b9888a18b08e\") " Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.248639 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f46a25a2-f362-487c-9511-b9888a18b08e-kube-api-access-nn68s" (OuterVolumeSpecName: "kube-api-access-nn68s") pod "f46a25a2-f362-487c-9511-b9888a18b08e" (UID: "f46a25a2-f362-487c-9511-b9888a18b08e"). InnerVolumeSpecName "kube-api-access-nn68s". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.273476 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f46a25a2-f362-487c-9511-b9888a18b08e-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f46a25a2-f362-487c-9511-b9888a18b08e" (UID: "f46a25a2-f362-487c-9511-b9888a18b08e"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.276151 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f46a25a2-f362-487c-9511-b9888a18b08e-inventory-0" (OuterVolumeSpecName: "inventory-0") pod "f46a25a2-f362-487c-9511-b9888a18b08e" (UID: "f46a25a2-f362-487c-9511-b9888a18b08e"). InnerVolumeSpecName "inventory-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.344172 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f46a25a2-f362-487c-9511-b9888a18b08e-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.344354 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nn68s\" (UniqueName: \"kubernetes.io/projected/f46a25a2-f362-487c-9511-b9888a18b08e-kube-api-access-nn68s\") on node \"crc\" DevicePath \"\"" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.344371 4765 reconciler_common.go:293] "Volume detached for volume \"inventory-0\" (UniqueName: \"kubernetes.io/secret/f46a25a2-f362-487c-9511-b9888a18b08e-inventory-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.498820 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:38 
crc kubenswrapper[4765]: I0319 10:55:38.498895 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.549697 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.597702 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.597696 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ssh-known-hosts-edpm-deployment-gc6dd" event={"ID":"f46a25a2-f362-487c-9511-b9888a18b08e","Type":"ContainerDied","Data":"c5d7d35c651131233d793281304f22e22b5c5659409c6020d8c22d296942cd55"} Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.597822 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c5d7d35c651131233d793281304f22e22b5c5659409c6020d8c22d296942cd55" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.600696 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlzdk" event={"ID":"3b72f4e7-91e9-4dee-95c2-54d750064831","Type":"ContainerStarted","Data":"336006787dfa0b81d741d25923e84c6b41875fe5fd400bd71db9d5c9348cf9ff"} Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.637428 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-xlzdk" podStartSLOduration=3.996170918 podStartE2EDuration="6.637407434s" podCreationTimestamp="2026-03-19 10:55:32 +0000 UTC" firstStartedPulling="2026-03-19 10:55:35.566064277 +0000 UTC m=+2033.915009819" lastFinishedPulling="2026-03-19 10:55:38.207300793 +0000 UTC m=+2036.556246335" observedRunningTime="2026-03-19 10:55:38.619151489 +0000 UTC m=+2036.968097031" 
watchObservedRunningTime="2026-03-19 10:55:38.637407434 +0000 UTC m=+2036.986352976" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.648639 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.678797 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5"] Mar 19 10:55:38 crc kubenswrapper[4765]: E0319 10:55:38.679376 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f46a25a2-f362-487c-9511-b9888a18b08e" containerName="ssh-known-hosts-edpm-deployment" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.679401 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f46a25a2-f362-487c-9511-b9888a18b08e" containerName="ssh-known-hosts-edpm-deployment" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.679667 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f46a25a2-f362-487c-9511-b9888a18b08e" containerName="ssh-known-hosts-edpm-deployment" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.680459 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.684825 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.685667 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.686087 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.687247 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.698046 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5"] Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.854158 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0d2d350b-0950-4e12-8ae8-57c8983079aa-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcnm5\" (UID: \"0d2d350b-0950-4e12-8ae8-57c8983079aa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.854340 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gpf2q\" (UniqueName: \"kubernetes.io/projected/0d2d350b-0950-4e12-8ae8-57c8983079aa-kube-api-access-gpf2q\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcnm5\" (UID: \"0d2d350b-0950-4e12-8ae8-57c8983079aa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.854380 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d2d350b-0950-4e12-8ae8-57c8983079aa-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcnm5\" (UID: \"0d2d350b-0950-4e12-8ae8-57c8983079aa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.956477 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gpf2q\" (UniqueName: \"kubernetes.io/projected/0d2d350b-0950-4e12-8ae8-57c8983079aa-kube-api-access-gpf2q\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcnm5\" (UID: \"0d2d350b-0950-4e12-8ae8-57c8983079aa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.956681 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d2d350b-0950-4e12-8ae8-57c8983079aa-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcnm5\" (UID: \"0d2d350b-0950-4e12-8ae8-57c8983079aa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.956769 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0d2d350b-0950-4e12-8ae8-57c8983079aa-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcnm5\" (UID: \"0d2d350b-0950-4e12-8ae8-57c8983079aa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.962378 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d2d350b-0950-4e12-8ae8-57c8983079aa-inventory\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcnm5\" (UID: 
\"0d2d350b-0950-4e12-8ae8-57c8983079aa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.963124 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0d2d350b-0950-4e12-8ae8-57c8983079aa-ssh-key-openstack-edpm-ipam\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcnm5\" (UID: \"0d2d350b-0950-4e12-8ae8-57c8983079aa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" Mar 19 10:55:38 crc kubenswrapper[4765]: I0319 10:55:38.981134 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gpf2q\" (UniqueName: \"kubernetes.io/projected/0d2d350b-0950-4e12-8ae8-57c8983079aa-kube-api-access-gpf2q\") pod \"run-os-edpm-deployment-openstack-edpm-ipam-jcnm5\" (UID: \"0d2d350b-0950-4e12-8ae8-57c8983079aa\") " pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" Mar 19 10:55:39 crc kubenswrapper[4765]: I0319 10:55:39.004540 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" Mar 19 10:55:39 crc kubenswrapper[4765]: I0319 10:55:39.543730 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5"] Mar 19 10:55:39 crc kubenswrapper[4765]: I0319 10:55:39.609514 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" event={"ID":"0d2d350b-0950-4e12-8ae8-57c8983079aa","Type":"ContainerStarted","Data":"26f0edfd9da267a93b19f5350c67e015b1bdf6dc14ea86a65c19eed13a7cb1b9"} Mar 19 10:55:40 crc kubenswrapper[4765]: I0319 10:55:40.619685 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" event={"ID":"0d2d350b-0950-4e12-8ae8-57c8983079aa","Type":"ContainerStarted","Data":"4212347e17eedde275b6ab728d7e6bd16040a7f57aea1af438ecdae155d37501"} Mar 19 10:55:40 crc kubenswrapper[4765]: I0319 10:55:40.644743 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" podStartSLOduration=2.026742548 podStartE2EDuration="2.644721927s" podCreationTimestamp="2026-03-19 10:55:38 +0000 UTC" firstStartedPulling="2026-03-19 10:55:39.553344264 +0000 UTC m=+2037.902289806" lastFinishedPulling="2026-03-19 10:55:40.171323643 +0000 UTC m=+2038.520269185" observedRunningTime="2026-03-19 10:55:40.638904609 +0000 UTC m=+2038.987850181" watchObservedRunningTime="2026-03-19 10:55:40.644721927 +0000 UTC m=+2038.993667469" Mar 19 10:55:41 crc kubenswrapper[4765]: I0319 10:55:41.752233 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g58gl"] Mar 19 10:55:41 crc kubenswrapper[4765]: I0319 10:55:41.753088 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g58gl" 
podUID="b2941c2c-827b-4461-8bf0-1b2d75feafc2" containerName="registry-server" containerID="cri-o://3bcc8f442fd11dad819b9de6c7dc8f2812afdf6caa37538807974a53ddf15d92" gracePeriod=2 Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.171896 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.323626 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2941c2c-827b-4461-8bf0-1b2d75feafc2-utilities\") pod \"b2941c2c-827b-4461-8bf0-1b2d75feafc2\" (UID: \"b2941c2c-827b-4461-8bf0-1b2d75feafc2\") " Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.323717 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dp84g\" (UniqueName: \"kubernetes.io/projected/b2941c2c-827b-4461-8bf0-1b2d75feafc2-kube-api-access-dp84g\") pod \"b2941c2c-827b-4461-8bf0-1b2d75feafc2\" (UID: \"b2941c2c-827b-4461-8bf0-1b2d75feafc2\") " Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.323767 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2941c2c-827b-4461-8bf0-1b2d75feafc2-catalog-content\") pod \"b2941c2c-827b-4461-8bf0-1b2d75feafc2\" (UID: \"b2941c2c-827b-4461-8bf0-1b2d75feafc2\") " Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.325267 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2941c2c-827b-4461-8bf0-1b2d75feafc2-utilities" (OuterVolumeSpecName: "utilities") pod "b2941c2c-827b-4461-8bf0-1b2d75feafc2" (UID: "b2941c2c-827b-4461-8bf0-1b2d75feafc2"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.329767 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2941c2c-827b-4461-8bf0-1b2d75feafc2-kube-api-access-dp84g" (OuterVolumeSpecName: "kube-api-access-dp84g") pod "b2941c2c-827b-4461-8bf0-1b2d75feafc2" (UID: "b2941c2c-827b-4461-8bf0-1b2d75feafc2"). InnerVolumeSpecName "kube-api-access-dp84g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.382459 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2941c2c-827b-4461-8bf0-1b2d75feafc2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b2941c2c-827b-4461-8bf0-1b2d75feafc2" (UID: "b2941c2c-827b-4461-8bf0-1b2d75feafc2"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.426218 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b2941c2c-827b-4461-8bf0-1b2d75feafc2-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.426273 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dp84g\" (UniqueName: \"kubernetes.io/projected/b2941c2c-827b-4461-8bf0-1b2d75feafc2-kube-api-access-dp84g\") on node \"crc\" DevicePath \"\"" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.426288 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b2941c2c-827b-4461-8bf0-1b2d75feafc2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.637950 4765 generic.go:334] "Generic (PLEG): container finished" podID="b2941c2c-827b-4461-8bf0-1b2d75feafc2" 
containerID="3bcc8f442fd11dad819b9de6c7dc8f2812afdf6caa37538807974a53ddf15d92" exitCode=0 Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.638029 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g58gl" event={"ID":"b2941c2c-827b-4461-8bf0-1b2d75feafc2","Type":"ContainerDied","Data":"3bcc8f442fd11dad819b9de6c7dc8f2812afdf6caa37538807974a53ddf15d92"} Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.638097 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g58gl" event={"ID":"b2941c2c-827b-4461-8bf0-1b2d75feafc2","Type":"ContainerDied","Data":"0fd08d355ef4620ccc2e6d3ad98fc897e916f4604462baf65d4a91f52cc430e3"} Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.638129 4765 scope.go:117] "RemoveContainer" containerID="3bcc8f442fd11dad819b9de6c7dc8f2812afdf6caa37538807974a53ddf15d92" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.638383 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g58gl" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.670045 4765 scope.go:117] "RemoveContainer" containerID="2cac623e8e77c007a4b54308c163768ee6d107e612e4eb3c09017a01c61458b0" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.677196 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g58gl"] Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.685535 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g58gl"] Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.700198 4765 scope.go:117] "RemoveContainer" containerID="67a5e80f5dc6370adf914fb88ed0fc3dc4144f5caa99492ee25da2c5f565705e" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.732135 4765 scope.go:117] "RemoveContainer" containerID="3bcc8f442fd11dad819b9de6c7dc8f2812afdf6caa37538807974a53ddf15d92" Mar 19 10:55:42 crc kubenswrapper[4765]: E0319 10:55:42.732640 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bcc8f442fd11dad819b9de6c7dc8f2812afdf6caa37538807974a53ddf15d92\": container with ID starting with 3bcc8f442fd11dad819b9de6c7dc8f2812afdf6caa37538807974a53ddf15d92 not found: ID does not exist" containerID="3bcc8f442fd11dad819b9de6c7dc8f2812afdf6caa37538807974a53ddf15d92" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.732688 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bcc8f442fd11dad819b9de6c7dc8f2812afdf6caa37538807974a53ddf15d92"} err="failed to get container status \"3bcc8f442fd11dad819b9de6c7dc8f2812afdf6caa37538807974a53ddf15d92\": rpc error: code = NotFound desc = could not find container \"3bcc8f442fd11dad819b9de6c7dc8f2812afdf6caa37538807974a53ddf15d92\": container with ID starting with 3bcc8f442fd11dad819b9de6c7dc8f2812afdf6caa37538807974a53ddf15d92 not 
found: ID does not exist" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.732716 4765 scope.go:117] "RemoveContainer" containerID="2cac623e8e77c007a4b54308c163768ee6d107e612e4eb3c09017a01c61458b0" Mar 19 10:55:42 crc kubenswrapper[4765]: E0319 10:55:42.733206 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2cac623e8e77c007a4b54308c163768ee6d107e612e4eb3c09017a01c61458b0\": container with ID starting with 2cac623e8e77c007a4b54308c163768ee6d107e612e4eb3c09017a01c61458b0 not found: ID does not exist" containerID="2cac623e8e77c007a4b54308c163768ee6d107e612e4eb3c09017a01c61458b0" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.733348 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2cac623e8e77c007a4b54308c163768ee6d107e612e4eb3c09017a01c61458b0"} err="failed to get container status \"2cac623e8e77c007a4b54308c163768ee6d107e612e4eb3c09017a01c61458b0\": rpc error: code = NotFound desc = could not find container \"2cac623e8e77c007a4b54308c163768ee6d107e612e4eb3c09017a01c61458b0\": container with ID starting with 2cac623e8e77c007a4b54308c163768ee6d107e612e4eb3c09017a01c61458b0 not found: ID does not exist" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.733436 4765 scope.go:117] "RemoveContainer" containerID="67a5e80f5dc6370adf914fb88ed0fc3dc4144f5caa99492ee25da2c5f565705e" Mar 19 10:55:42 crc kubenswrapper[4765]: E0319 10:55:42.733822 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67a5e80f5dc6370adf914fb88ed0fc3dc4144f5caa99492ee25da2c5f565705e\": container with ID starting with 67a5e80f5dc6370adf914fb88ed0fc3dc4144f5caa99492ee25da2c5f565705e not found: ID does not exist" containerID="67a5e80f5dc6370adf914fb88ed0fc3dc4144f5caa99492ee25da2c5f565705e" Mar 19 10:55:42 crc kubenswrapper[4765]: I0319 10:55:42.733909 4765 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"67a5e80f5dc6370adf914fb88ed0fc3dc4144f5caa99492ee25da2c5f565705e"} err="failed to get container status \"67a5e80f5dc6370adf914fb88ed0fc3dc4144f5caa99492ee25da2c5f565705e\": rpc error: code = NotFound desc = could not find container \"67a5e80f5dc6370adf914fb88ed0fc3dc4144f5caa99492ee25da2c5f565705e\": container with ID starting with 67a5e80f5dc6370adf914fb88ed0fc3dc4144f5caa99492ee25da2c5f565705e not found: ID does not exist" Mar 19 10:55:43 crc kubenswrapper[4765]: I0319 10:55:43.342309 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-xlzdk" Mar 19 10:55:43 crc kubenswrapper[4765]: I0319 10:55:43.342674 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-xlzdk" Mar 19 10:55:43 crc kubenswrapper[4765]: I0319 10:55:43.390730 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-xlzdk" Mar 19 10:55:43 crc kubenswrapper[4765]: I0319 10:55:43.704504 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-xlzdk" Mar 19 10:55:44 crc kubenswrapper[4765]: I0319 10:55:44.386446 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2941c2c-827b-4461-8bf0-1b2d75feafc2" path="/var/lib/kubelet/pods/b2941c2c-827b-4461-8bf0-1b2d75feafc2/volumes" Mar 19 10:55:47 crc kubenswrapper[4765]: I0319 10:55:47.548066 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xlzdk"] Mar 19 10:55:47 crc kubenswrapper[4765]: I0319 10:55:47.548688 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-xlzdk" podUID="3b72f4e7-91e9-4dee-95c2-54d750064831" containerName="registry-server" 
containerID="cri-o://336006787dfa0b81d741d25923e84c6b41875fe5fd400bd71db9d5c9348cf9ff" gracePeriod=2
Mar 19 10:55:47 crc kubenswrapper[4765]: I0319 10:55:47.694770 4765 generic.go:334] "Generic (PLEG): container finished" podID="0d2d350b-0950-4e12-8ae8-57c8983079aa" containerID="4212347e17eedde275b6ab728d7e6bd16040a7f57aea1af438ecdae155d37501" exitCode=0
Mar 19 10:55:47 crc kubenswrapper[4765]: I0319 10:55:47.694847 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" event={"ID":"0d2d350b-0950-4e12-8ae8-57c8983079aa","Type":"ContainerDied","Data":"4212347e17eedde275b6ab728d7e6bd16040a7f57aea1af438ecdae155d37501"}
Mar 19 10:55:47 crc kubenswrapper[4765]: I0319 10:55:47.699905 4765 generic.go:334] "Generic (PLEG): container finished" podID="3b72f4e7-91e9-4dee-95c2-54d750064831" containerID="336006787dfa0b81d741d25923e84c6b41875fe5fd400bd71db9d5c9348cf9ff" exitCode=0
Mar 19 10:55:47 crc kubenswrapper[4765]: I0319 10:55:47.699978 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlzdk" event={"ID":"3b72f4e7-91e9-4dee-95c2-54d750064831","Type":"ContainerDied","Data":"336006787dfa0b81d741d25923e84c6b41875fe5fd400bd71db9d5c9348cf9ff"}
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.046949 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xlzdk"
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.237800 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b72f4e7-91e9-4dee-95c2-54d750064831-utilities\") pod \"3b72f4e7-91e9-4dee-95c2-54d750064831\" (UID: \"3b72f4e7-91e9-4dee-95c2-54d750064831\") "
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.237998 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42zc7\" (UniqueName: \"kubernetes.io/projected/3b72f4e7-91e9-4dee-95c2-54d750064831-kube-api-access-42zc7\") pod \"3b72f4e7-91e9-4dee-95c2-54d750064831\" (UID: \"3b72f4e7-91e9-4dee-95c2-54d750064831\") "
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.238051 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b72f4e7-91e9-4dee-95c2-54d750064831-catalog-content\") pod \"3b72f4e7-91e9-4dee-95c2-54d750064831\" (UID: \"3b72f4e7-91e9-4dee-95c2-54d750064831\") "
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.239462 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b72f4e7-91e9-4dee-95c2-54d750064831-utilities" (OuterVolumeSpecName: "utilities") pod "3b72f4e7-91e9-4dee-95c2-54d750064831" (UID: "3b72f4e7-91e9-4dee-95c2-54d750064831"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.246302 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b72f4e7-91e9-4dee-95c2-54d750064831-kube-api-access-42zc7" (OuterVolumeSpecName: "kube-api-access-42zc7") pod "3b72f4e7-91e9-4dee-95c2-54d750064831" (UID: "3b72f4e7-91e9-4dee-95c2-54d750064831"). InnerVolumeSpecName "kube-api-access-42zc7". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.290634 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b72f4e7-91e9-4dee-95c2-54d750064831-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3b72f4e7-91e9-4dee-95c2-54d750064831" (UID: "3b72f4e7-91e9-4dee-95c2-54d750064831"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.340741 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3b72f4e7-91e9-4dee-95c2-54d750064831-utilities\") on node \"crc\" DevicePath \"\""
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.340780 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42zc7\" (UniqueName: \"kubernetes.io/projected/3b72f4e7-91e9-4dee-95c2-54d750064831-kube-api-access-42zc7\") on node \"crc\" DevicePath \"\""
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.340792 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3b72f4e7-91e9-4dee-95c2-54d750064831-catalog-content\") on node \"crc\" DevicePath \"\""
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.712355 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-xlzdk" event={"ID":"3b72f4e7-91e9-4dee-95c2-54d750064831","Type":"ContainerDied","Data":"a35ec999adb01c6a88d4181fe6864c8aa0f07ff98a09e147da798be8bf06d707"}
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.712425 4765 scope.go:117] "RemoveContainer" containerID="336006787dfa0b81d741d25923e84c6b41875fe5fd400bd71db9d5c9348cf9ff"
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.713100 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-xlzdk"
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.747326 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-xlzdk"]
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.747404 4765 scope.go:117] "RemoveContainer" containerID="7864e42ab4248cf183b3a18caa4c66d188f9d898e1ab23e5b0c481709be56871"
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.755486 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-xlzdk"]
Mar 19 10:55:48 crc kubenswrapper[4765]: I0319 10:55:48.776498 4765 scope.go:117] "RemoveContainer" containerID="8edf1a3006aaaa72290a45f5e99ea172283e635200a91b4b048ee4224147f121"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.107367 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.262845 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d2d350b-0950-4e12-8ae8-57c8983079aa-inventory\") pod \"0d2d350b-0950-4e12-8ae8-57c8983079aa\" (UID: \"0d2d350b-0950-4e12-8ae8-57c8983079aa\") "
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.262890 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0d2d350b-0950-4e12-8ae8-57c8983079aa-ssh-key-openstack-edpm-ipam\") pod \"0d2d350b-0950-4e12-8ae8-57c8983079aa\" (UID: \"0d2d350b-0950-4e12-8ae8-57c8983079aa\") "
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.263214 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gpf2q\" (UniqueName: \"kubernetes.io/projected/0d2d350b-0950-4e12-8ae8-57c8983079aa-kube-api-access-gpf2q\") pod \"0d2d350b-0950-4e12-8ae8-57c8983079aa\" (UID: \"0d2d350b-0950-4e12-8ae8-57c8983079aa\") "
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.268201 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d2d350b-0950-4e12-8ae8-57c8983079aa-kube-api-access-gpf2q" (OuterVolumeSpecName: "kube-api-access-gpf2q") pod "0d2d350b-0950-4e12-8ae8-57c8983079aa" (UID: "0d2d350b-0950-4e12-8ae8-57c8983079aa"). InnerVolumeSpecName "kube-api-access-gpf2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.298412 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2d350b-0950-4e12-8ae8-57c8983079aa-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "0d2d350b-0950-4e12-8ae8-57c8983079aa" (UID: "0d2d350b-0950-4e12-8ae8-57c8983079aa"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.301486 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d2d350b-0950-4e12-8ae8-57c8983079aa-inventory" (OuterVolumeSpecName: "inventory") pod "0d2d350b-0950-4e12-8ae8-57c8983079aa" (UID: "0d2d350b-0950-4e12-8ae8-57c8983079aa"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.365777 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gpf2q\" (UniqueName: \"kubernetes.io/projected/0d2d350b-0950-4e12-8ae8-57c8983079aa-kube-api-access-gpf2q\") on node \"crc\" DevicePath \"\""
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.366203 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/0d2d350b-0950-4e12-8ae8-57c8983079aa-inventory\") on node \"crc\" DevicePath \"\""
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.366222 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/0d2d350b-0950-4e12-8ae8-57c8983079aa-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.723168 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5" event={"ID":"0d2d350b-0950-4e12-8ae8-57c8983079aa","Type":"ContainerDied","Data":"26f0edfd9da267a93b19f5350c67e015b1bdf6dc14ea86a65c19eed13a7cb1b9"}
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.723204 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="26f0edfd9da267a93b19f5350c67e015b1bdf6dc14ea86a65c19eed13a7cb1b9"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.723186 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/run-os-edpm-deployment-openstack-edpm-ipam-jcnm5"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.786664 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"]
Mar 19 10:55:49 crc kubenswrapper[4765]: E0319 10:55:49.787229 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b72f4e7-91e9-4dee-95c2-54d750064831" containerName="registry-server"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.787246 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b72f4e7-91e9-4dee-95c2-54d750064831" containerName="registry-server"
Mar 19 10:55:49 crc kubenswrapper[4765]: E0319 10:55:49.787272 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2941c2c-827b-4461-8bf0-1b2d75feafc2" containerName="extract-content"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.787280 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2941c2c-827b-4461-8bf0-1b2d75feafc2" containerName="extract-content"
Mar 19 10:55:49 crc kubenswrapper[4765]: E0319 10:55:49.787292 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2941c2c-827b-4461-8bf0-1b2d75feafc2" containerName="extract-utilities"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.787298 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2941c2c-827b-4461-8bf0-1b2d75feafc2" containerName="extract-utilities"
Mar 19 10:55:49 crc kubenswrapper[4765]: E0319 10:55:49.787317 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2941c2c-827b-4461-8bf0-1b2d75feafc2" containerName="registry-server"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.787325 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2941c2c-827b-4461-8bf0-1b2d75feafc2" containerName="registry-server"
Mar 19 10:55:49 crc kubenswrapper[4765]: E0319 10:55:49.787333 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b72f4e7-91e9-4dee-95c2-54d750064831" containerName="extract-content"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.787340 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b72f4e7-91e9-4dee-95c2-54d750064831" containerName="extract-content"
Mar 19 10:55:49 crc kubenswrapper[4765]: E0319 10:55:49.787362 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b72f4e7-91e9-4dee-95c2-54d750064831" containerName="extract-utilities"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.787370 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b72f4e7-91e9-4dee-95c2-54d750064831" containerName="extract-utilities"
Mar 19 10:55:49 crc kubenswrapper[4765]: E0319 10:55:49.787384 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d2d350b-0950-4e12-8ae8-57c8983079aa" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.787394 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d2d350b-0950-4e12-8ae8-57c8983079aa" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.787603 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d2d350b-0950-4e12-8ae8-57c8983079aa" containerName="run-os-edpm-deployment-openstack-edpm-ipam"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.787620 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b72f4e7-91e9-4dee-95c2-54d750064831" containerName="registry-server"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.787629 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2941c2c-827b-4461-8bf0-1b2d75feafc2" containerName="registry-server"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.788377 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.790284 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.790507 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.790833 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.791020 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.801208 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"]
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.876698 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/525004be-ff4e-4c2d-ad4d-0ed018eecc09-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt\" (UID: \"525004be-ff4e-4c2d-ad4d-0ed018eecc09\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.876760 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4x7s\" (UniqueName: \"kubernetes.io/projected/525004be-ff4e-4c2d-ad4d-0ed018eecc09-kube-api-access-z4x7s\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt\" (UID: \"525004be-ff4e-4c2d-ad4d-0ed018eecc09\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.876788 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/525004be-ff4e-4c2d-ad4d-0ed018eecc09-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt\" (UID: \"525004be-ff4e-4c2d-ad4d-0ed018eecc09\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.979318 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/525004be-ff4e-4c2d-ad4d-0ed018eecc09-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt\" (UID: \"525004be-ff4e-4c2d-ad4d-0ed018eecc09\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.979395 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4x7s\" (UniqueName: \"kubernetes.io/projected/525004be-ff4e-4c2d-ad4d-0ed018eecc09-kube-api-access-z4x7s\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt\" (UID: \"525004be-ff4e-4c2d-ad4d-0ed018eecc09\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.979440 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/525004be-ff4e-4c2d-ad4d-0ed018eecc09-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt\" (UID: \"525004be-ff4e-4c2d-ad4d-0ed018eecc09\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.984372 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/525004be-ff4e-4c2d-ad4d-0ed018eecc09-ssh-key-openstack-edpm-ipam\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt\" (UID: \"525004be-ff4e-4c2d-ad4d-0ed018eecc09\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"
Mar 19 10:55:49 crc kubenswrapper[4765]: I0319 10:55:49.986450 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/525004be-ff4e-4c2d-ad4d-0ed018eecc09-inventory\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt\" (UID: \"525004be-ff4e-4c2d-ad4d-0ed018eecc09\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"
Mar 19 10:55:50 crc kubenswrapper[4765]: I0319 10:55:50.005855 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4x7s\" (UniqueName: \"kubernetes.io/projected/525004be-ff4e-4c2d-ad4d-0ed018eecc09-kube-api-access-z4x7s\") pod \"reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt\" (UID: \"525004be-ff4e-4c2d-ad4d-0ed018eecc09\") " pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"
Mar 19 10:55:50 crc kubenswrapper[4765]: I0319 10:55:50.135476 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"
Mar 19 10:55:50 crc kubenswrapper[4765]: I0319 10:55:50.372022 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b72f4e7-91e9-4dee-95c2-54d750064831" path="/var/lib/kubelet/pods/3b72f4e7-91e9-4dee-95c2-54d750064831/volumes"
Mar 19 10:55:50 crc kubenswrapper[4765]: I0319 10:55:50.675551 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"]
Mar 19 10:55:50 crc kubenswrapper[4765]: I0319 10:55:50.733230 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt" event={"ID":"525004be-ff4e-4c2d-ad4d-0ed018eecc09","Type":"ContainerStarted","Data":"e088210eefc02481ccc935533fad362e955d3dadd0f7af30942e5191fedcf839"}
Mar 19 10:55:51 crc kubenswrapper[4765]: I0319 10:55:51.746294 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt" event={"ID":"525004be-ff4e-4c2d-ad4d-0ed018eecc09","Type":"ContainerStarted","Data":"359a02bfc70b078f4071cabe4435f112ba9a98b0eaa54d2c9467c4c785124152"}
Mar 19 10:55:51 crc kubenswrapper[4765]: I0319 10:55:51.769033 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt" podStartSLOduration=2.365784662 podStartE2EDuration="2.769014325s" podCreationTimestamp="2026-03-19 10:55:49 +0000 UTC" firstStartedPulling="2026-03-19 10:55:50.681768624 +0000 UTC m=+2049.030714176" lastFinishedPulling="2026-03-19 10:55:51.084998297 +0000 UTC m=+2049.433943839" observedRunningTime="2026-03-19 10:55:51.765765837 +0000 UTC m=+2050.114711379" watchObservedRunningTime="2026-03-19 10:55:51.769014325 +0000 UTC m=+2050.117959857"
Mar 19 10:55:55 crc kubenswrapper[4765]: I0319 10:55:55.047221 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/nova-cell1-cell-mapping-prtgb"]
Mar 19 10:55:55 crc kubenswrapper[4765]: I0319 10:55:55.055229 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/nova-cell1-cell-mapping-prtgb"]
Mar 19 10:55:56 crc kubenswrapper[4765]: I0319 10:55:56.379144 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1669f945-1247-495f-b598-de4d6703a5cd" path="/var/lib/kubelet/pods/1669f945-1247-495f-b598-de4d6703a5cd/volumes"
Mar 19 10:55:59 crc kubenswrapper[4765]: I0319 10:55:59.821751 4765 generic.go:334] "Generic (PLEG): container finished" podID="525004be-ff4e-4c2d-ad4d-0ed018eecc09" containerID="359a02bfc70b078f4071cabe4435f112ba9a98b0eaa54d2c9467c4c785124152" exitCode=0
Mar 19 10:55:59 crc kubenswrapper[4765]: I0319 10:55:59.821841 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt" event={"ID":"525004be-ff4e-4c2d-ad4d-0ed018eecc09","Type":"ContainerDied","Data":"359a02bfc70b078f4071cabe4435f112ba9a98b0eaa54d2c9467c4c785124152"}
Mar 19 10:56:00 crc kubenswrapper[4765]: I0319 10:56:00.135336 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565296-jm5j9"]
Mar 19 10:56:00 crc kubenswrapper[4765]: I0319 10:56:00.138122 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565296-jm5j9"
Mar 19 10:56:00 crc kubenswrapper[4765]: I0319 10:56:00.141162 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm"
Mar 19 10:56:00 crc kubenswrapper[4765]: I0319 10:56:00.141201 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 10:56:00 crc kubenswrapper[4765]: I0319 10:56:00.141267 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 10:56:00 crc kubenswrapper[4765]: I0319 10:56:00.145667 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565296-jm5j9"]
Mar 19 10:56:00 crc kubenswrapper[4765]: I0319 10:56:00.161836 4765 scope.go:117] "RemoveContainer" containerID="746ccb08310d5fcb894b2b286fba879a7a461060b0fffa5c493abf918fecb04e"
Mar 19 10:56:00 crc kubenswrapper[4765]: I0319 10:56:00.193683 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vtqk\" (UniqueName: \"kubernetes.io/projected/1b27cbdf-9041-4305-a969-586e8b4d09b4-kube-api-access-4vtqk\") pod \"auto-csr-approver-29565296-jm5j9\" (UID: \"1b27cbdf-9041-4305-a969-586e8b4d09b4\") " pod="openshift-infra/auto-csr-approver-29565296-jm5j9"
Mar 19 10:56:00 crc kubenswrapper[4765]: I0319 10:56:00.216118 4765 scope.go:117] "RemoveContainer" containerID="36ccb5018079259f39da06eb2f26efb097b024f3fb278fd29fbcf63a153a015f"
Mar 19 10:56:00 crc kubenswrapper[4765]: I0319 10:56:00.265486 4765 scope.go:117] "RemoveContainer" containerID="d41a94ebe6dac2fd803bb512fe51a25e92e5c37754d765f38f9c7968df6f685c"
Mar 19 10:56:00 crc kubenswrapper[4765]: I0319 10:56:00.295885 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4vtqk\" (UniqueName: \"kubernetes.io/projected/1b27cbdf-9041-4305-a969-586e8b4d09b4-kube-api-access-4vtqk\") pod \"auto-csr-approver-29565296-jm5j9\" (UID: \"1b27cbdf-9041-4305-a969-586e8b4d09b4\") " pod="openshift-infra/auto-csr-approver-29565296-jm5j9"
Mar 19 10:56:00 crc kubenswrapper[4765]: I0319 10:56:00.320218 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vtqk\" (UniqueName: \"kubernetes.io/projected/1b27cbdf-9041-4305-a969-586e8b4d09b4-kube-api-access-4vtqk\") pod \"auto-csr-approver-29565296-jm5j9\" (UID: \"1b27cbdf-9041-4305-a969-586e8b4d09b4\") " pod="openshift-infra/auto-csr-approver-29565296-jm5j9"
Mar 19 10:56:00 crc kubenswrapper[4765]: I0319 10:56:00.459803 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565296-jm5j9"
Mar 19 10:56:00 crc kubenswrapper[4765]: I0319 10:56:00.892245 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565296-jm5j9"]
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.147057 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.316724 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/525004be-ff4e-4c2d-ad4d-0ed018eecc09-inventory\") pod \"525004be-ff4e-4c2d-ad4d-0ed018eecc09\" (UID: \"525004be-ff4e-4c2d-ad4d-0ed018eecc09\") "
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.317018 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/525004be-ff4e-4c2d-ad4d-0ed018eecc09-ssh-key-openstack-edpm-ipam\") pod \"525004be-ff4e-4c2d-ad4d-0ed018eecc09\" (UID: \"525004be-ff4e-4c2d-ad4d-0ed018eecc09\") "
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.317096 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4x7s\" (UniqueName: \"kubernetes.io/projected/525004be-ff4e-4c2d-ad4d-0ed018eecc09-kube-api-access-z4x7s\") pod \"525004be-ff4e-4c2d-ad4d-0ed018eecc09\" (UID: \"525004be-ff4e-4c2d-ad4d-0ed018eecc09\") "
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.323673 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/525004be-ff4e-4c2d-ad4d-0ed018eecc09-kube-api-access-z4x7s" (OuterVolumeSpecName: "kube-api-access-z4x7s") pod "525004be-ff4e-4c2d-ad4d-0ed018eecc09" (UID: "525004be-ff4e-4c2d-ad4d-0ed018eecc09"). InnerVolumeSpecName "kube-api-access-z4x7s". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.347226 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/525004be-ff4e-4c2d-ad4d-0ed018eecc09-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "525004be-ff4e-4c2d-ad4d-0ed018eecc09" (UID: "525004be-ff4e-4c2d-ad4d-0ed018eecc09"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.349122 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/525004be-ff4e-4c2d-ad4d-0ed018eecc09-inventory" (OuterVolumeSpecName: "inventory") pod "525004be-ff4e-4c2d-ad4d-0ed018eecc09" (UID: "525004be-ff4e-4c2d-ad4d-0ed018eecc09"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.419542 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4x7s\" (UniqueName: \"kubernetes.io/projected/525004be-ff4e-4c2d-ad4d-0ed018eecc09-kube-api-access-z4x7s\") on node \"crc\" DevicePath \"\""
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.419575 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/525004be-ff4e-4c2d-ad4d-0ed018eecc09-inventory\") on node \"crc\" DevicePath \"\""
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.419585 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/525004be-ff4e-4c2d-ad4d-0ed018eecc09-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.846036 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt" event={"ID":"525004be-ff4e-4c2d-ad4d-0ed018eecc09","Type":"ContainerDied","Data":"e088210eefc02481ccc935533fad362e955d3dadd0f7af30942e5191fedcf839"}
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.846126 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e088210eefc02481ccc935533fad362e955d3dadd0f7af30942e5191fedcf839"
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.846064 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt"
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.849043 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565296-jm5j9" event={"ID":"1b27cbdf-9041-4305-a969-586e8b4d09b4","Type":"ContainerStarted","Data":"978ff8c510b6d3c0e54de0171645c890c17cefb1215ecac05e031878f4122aad"}
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.935228 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"]
Mar 19 10:56:01 crc kubenswrapper[4765]: E0319 10:56:01.935644 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="525004be-ff4e-4c2d-ad4d-0ed018eecc09" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.935660 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="525004be-ff4e-4c2d-ad4d-0ed018eecc09" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.935832 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="525004be-ff4e-4c2d-ad4d-0ed018eecc09" containerName="reboot-os-edpm-deployment-openstack-edpm-ipam"
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.936450 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.939787 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-neutron-metadata-default-certs-0"
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.939948 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-ovn-default-certs-0"
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.940361 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c"
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.940578 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-libvirt-default-certs-0"
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.940667 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.940738 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.940361 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-telemetry-default-certs-0"
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.941136 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 19 10:56:01 crc kubenswrapper[4765]: I0319 10:56:01.959552 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"]
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.044318 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.044373 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.044429 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.044454 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.044495 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.044549 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.044609 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.044660 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn9gj\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-kube-api-access-sn9gj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.044693 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.044745 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.044792 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.044828 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.044858 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.044910 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.146295 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.146612 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"
Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.146803 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName:
\"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.147023 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.147432 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.147616 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sn9gj\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-kube-api-access-sn9gj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.149593 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: 
\"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.150368 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.150632 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.150774 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.150889 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-ssh-key-openstack-edpm-ipam\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.150827 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-ovn-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.150835 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.151065 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.151447 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-telemetry-combined-ca-bundle\") pod 
\"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.151623 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.152880 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-nova-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.154797 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-telemetry-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.154886 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " 
pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.155412 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-bootstrap-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.155671 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-neutron-metadata-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.155886 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.156246 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-libvirt-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: 
I0319 10:56:02.156627 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-inventory\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.157262 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-repo-setup-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.158473 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-ovn-combined-ca-bundle\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.159624 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-ssh-key-openstack-edpm-ipam\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.168204 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn9gj\" (UniqueName: 
\"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-kube-api-access-sn9gj\") pod \"install-certs-edpm-deployment-openstack-edpm-ipam-ppstd\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.256575 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.802335 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd"] Mar 19 10:56:02 crc kubenswrapper[4765]: W0319 10:56:02.852381 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podac71ef52_bfb2_44d0_be24_71e8e5e58475.slice/crio-83268ba566fe0f6ec6b22e190663377248b4c36d49b3f47a1beb8be2f1ab5bdf WatchSource:0}: Error finding container 83268ba566fe0f6ec6b22e190663377248b4c36d49b3f47a1beb8be2f1ab5bdf: Status 404 returned error can't find the container with id 83268ba566fe0f6ec6b22e190663377248b4c36d49b3f47a1beb8be2f1ab5bdf Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.870301 4765 generic.go:334] "Generic (PLEG): container finished" podID="1b27cbdf-9041-4305-a969-586e8b4d09b4" containerID="d92d543c3ed27b928edd3cf54bedff8a1dbc17673c4568aac83602be8ee7af84" exitCode=0 Mar 19 10:56:02 crc kubenswrapper[4765]: I0319 10:56:02.870349 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565296-jm5j9" event={"ID":"1b27cbdf-9041-4305-a969-586e8b4d09b4","Type":"ContainerDied","Data":"d92d543c3ed27b928edd3cf54bedff8a1dbc17673c4568aac83602be8ee7af84"} Mar 19 10:56:03 crc kubenswrapper[4765]: I0319 10:56:03.882764 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" 
event={"ID":"ac71ef52-bfb2-44d0-be24-71e8e5e58475","Type":"ContainerStarted","Data":"a9ea7f0222c7910aad27c513eb2d1f88da1545ade1e9df6d4ca35c5107583a0b"} Mar 19 10:56:03 crc kubenswrapper[4765]: I0319 10:56:03.883121 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" event={"ID":"ac71ef52-bfb2-44d0-be24-71e8e5e58475","Type":"ContainerStarted","Data":"83268ba566fe0f6ec6b22e190663377248b4c36d49b3f47a1beb8be2f1ab5bdf"} Mar 19 10:56:03 crc kubenswrapper[4765]: I0319 10:56:03.914914 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" podStartSLOduration=2.36615119 podStartE2EDuration="2.914895044s" podCreationTimestamp="2026-03-19 10:56:01 +0000 UTC" firstStartedPulling="2026-03-19 10:56:02.855777377 +0000 UTC m=+2061.204722919" lastFinishedPulling="2026-03-19 10:56:03.404521231 +0000 UTC m=+2061.753466773" observedRunningTime="2026-03-19 10:56:03.906380814 +0000 UTC m=+2062.255326386" watchObservedRunningTime="2026-03-19 10:56:03.914895044 +0000 UTC m=+2062.263840587" Mar 19 10:56:04 crc kubenswrapper[4765]: I0319 10:56:04.238272 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565296-jm5j9" Mar 19 10:56:04 crc kubenswrapper[4765]: I0319 10:56:04.397500 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vtqk\" (UniqueName: \"kubernetes.io/projected/1b27cbdf-9041-4305-a969-586e8b4d09b4-kube-api-access-4vtqk\") pod \"1b27cbdf-9041-4305-a969-586e8b4d09b4\" (UID: \"1b27cbdf-9041-4305-a969-586e8b4d09b4\") " Mar 19 10:56:04 crc kubenswrapper[4765]: I0319 10:56:04.405324 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b27cbdf-9041-4305-a969-586e8b4d09b4-kube-api-access-4vtqk" (OuterVolumeSpecName: "kube-api-access-4vtqk") pod "1b27cbdf-9041-4305-a969-586e8b4d09b4" (UID: "1b27cbdf-9041-4305-a969-586e8b4d09b4"). InnerVolumeSpecName "kube-api-access-4vtqk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:56:04 crc kubenswrapper[4765]: I0319 10:56:04.501228 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4vtqk\" (UniqueName: \"kubernetes.io/projected/1b27cbdf-9041-4305-a969-586e8b4d09b4-kube-api-access-4vtqk\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:04 crc kubenswrapper[4765]: I0319 10:56:04.892199 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565296-jm5j9" Mar 19 10:56:04 crc kubenswrapper[4765]: I0319 10:56:04.892213 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565296-jm5j9" event={"ID":"1b27cbdf-9041-4305-a969-586e8b4d09b4","Type":"ContainerDied","Data":"978ff8c510b6d3c0e54de0171645c890c17cefb1215ecac05e031878f4122aad"} Mar 19 10:56:04 crc kubenswrapper[4765]: I0319 10:56:04.892249 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="978ff8c510b6d3c0e54de0171645c890c17cefb1215ecac05e031878f4122aad" Mar 19 10:56:05 crc kubenswrapper[4765]: I0319 10:56:05.304004 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565290-zt5xt"] Mar 19 10:56:05 crc kubenswrapper[4765]: I0319 10:56:05.312109 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565290-zt5xt"] Mar 19 10:56:06 crc kubenswrapper[4765]: I0319 10:56:06.368566 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="057f3e2f-f77b-4da3-a67b-4f0777602577" path="/var/lib/kubelet/pods/057f3e2f-f77b-4da3-a67b-4f0777602577/volumes" Mar 19 10:56:35 crc kubenswrapper[4765]: I0319 10:56:35.157198 4765 generic.go:334] "Generic (PLEG): container finished" podID="ac71ef52-bfb2-44d0-be24-71e8e5e58475" containerID="a9ea7f0222c7910aad27c513eb2d1f88da1545ade1e9df6d4ca35c5107583a0b" exitCode=0 Mar 19 10:56:35 crc kubenswrapper[4765]: I0319 10:56:35.157290 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" event={"ID":"ac71ef52-bfb2-44d0-be24-71e8e5e58475","Type":"ContainerDied","Data":"a9ea7f0222c7910aad27c513eb2d1f88da1545ade1e9df6d4ca35c5107583a0b"} Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.609580 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.797920 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-repo-setup-combined-ca-bundle\") pod \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.797990 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-telemetry-default-certs-0\") pod \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.798020 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-libvirt-combined-ca-bundle\") pod \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.798041 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-ssh-key-openstack-edpm-ipam\") pod \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.798061 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-telemetry-combined-ca-bundle\") pod \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\" (UID: 
\"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.798085 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-nova-combined-ca-bundle\") pod \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.798114 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn9gj\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-kube-api-access-sn9gj\") pod \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.798174 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-inventory\") pod \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.798198 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-bootstrap-combined-ca-bundle\") pod \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.798235 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-libvirt-default-certs-0\") pod \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.798253 4765 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-neutron-metadata-combined-ca-bundle\") pod \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.798327 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-neutron-metadata-default-certs-0\") pod \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.798409 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-ovn-default-certs-0\") pod \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.798439 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-ovn-combined-ca-bundle\") pod \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\" (UID: \"ac71ef52-bfb2-44d0-be24-71e8e5e58475\") " Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.805330 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-telemetry-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-telemetry-default-certs-0") pod "ac71ef52-bfb2-44d0-be24-71e8e5e58475" (UID: "ac71ef52-bfb2-44d0-be24-71e8e5e58475"). 
InnerVolumeSpecName "openstack-edpm-ipam-telemetry-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.806239 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-ovn-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-ovn-default-certs-0") pod "ac71ef52-bfb2-44d0-be24-71e8e5e58475" (UID: "ac71ef52-bfb2-44d0-be24-71e8e5e58475"). InnerVolumeSpecName "openstack-edpm-ipam-ovn-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.806425 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "ac71ef52-bfb2-44d0-be24-71e8e5e58475" (UID: "ac71ef52-bfb2-44d0-be24-71e8e5e58475"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.806557 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-kube-api-access-sn9gj" (OuterVolumeSpecName: "kube-api-access-sn9gj") pod "ac71ef52-bfb2-44d0-be24-71e8e5e58475" (UID: "ac71ef52-bfb2-44d0-be24-71e8e5e58475"). InnerVolumeSpecName "kube-api-access-sn9gj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.806734 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-bootstrap-combined-ca-bundle" (OuterVolumeSpecName: "bootstrap-combined-ca-bundle") pod "ac71ef52-bfb2-44d0-be24-71e8e5e58475" (UID: "ac71ef52-bfb2-44d0-be24-71e8e5e58475"). 
InnerVolumeSpecName "bootstrap-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.806849 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "ac71ef52-bfb2-44d0-be24-71e8e5e58475" (UID: "ac71ef52-bfb2-44d0-be24-71e8e5e58475"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.807551 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-libvirt-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-libvirt-default-certs-0") pod "ac71ef52-bfb2-44d0-be24-71e8e5e58475" (UID: "ac71ef52-bfb2-44d0-be24-71e8e5e58475"). InnerVolumeSpecName "openstack-edpm-ipam-libvirt-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.808788 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-repo-setup-combined-ca-bundle" (OuterVolumeSpecName: "repo-setup-combined-ca-bundle") pod "ac71ef52-bfb2-44d0-be24-71e8e5e58475" (UID: "ac71ef52-bfb2-44d0-be24-71e8e5e58475"). InnerVolumeSpecName "repo-setup-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.808812 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "ac71ef52-bfb2-44d0-be24-71e8e5e58475" (UID: "ac71ef52-bfb2-44d0-be24-71e8e5e58475"). InnerVolumeSpecName "nova-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.810080 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-neutron-metadata-default-certs-0" (OuterVolumeSpecName: "openstack-edpm-ipam-neutron-metadata-default-certs-0") pod "ac71ef52-bfb2-44d0-be24-71e8e5e58475" (UID: "ac71ef52-bfb2-44d0-be24-71e8e5e58475"). InnerVolumeSpecName "openstack-edpm-ipam-neutron-metadata-default-certs-0". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.810584 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "ac71ef52-bfb2-44d0-be24-71e8e5e58475" (UID: "ac71ef52-bfb2-44d0-be24-71e8e5e58475"). InnerVolumeSpecName "telemetry-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.814387 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "ac71ef52-bfb2-44d0-be24-71e8e5e58475" (UID: "ac71ef52-bfb2-44d0-be24-71e8e5e58475"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.834774 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "ac71ef52-bfb2-44d0-be24-71e8e5e58475" (UID: "ac71ef52-bfb2-44d0-be24-71e8e5e58475"). 
InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.835405 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-inventory" (OuterVolumeSpecName: "inventory") pod "ac71ef52-bfb2-44d0-be24-71e8e5e58475" (UID: "ac71ef52-bfb2-44d0-be24-71e8e5e58475"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.901173 4765 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.901228 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.901237 4765 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.901245 4765 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.901254 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sn9gj\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-kube-api-access-sn9gj\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:36 crc 
kubenswrapper[4765]: I0319 10:56:36.901263 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.901274 4765 reconciler_common.go:293] "Volume detached for volume \"bootstrap-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-bootstrap-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.901285 4765 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-libvirt-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-libvirt-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.901295 4765 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.901307 4765 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-neutron-metadata-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-neutron-metadata-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.901322 4765 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-ovn-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-ovn-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.901334 4765 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.901347 4765 reconciler_common.go:293] "Volume detached for volume \"repo-setup-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/ac71ef52-bfb2-44d0-be24-71e8e5e58475-repo-setup-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:36 crc kubenswrapper[4765]: I0319 10:56:36.901359 4765 reconciler_common.go:293] "Volume detached for volume \"openstack-edpm-ipam-telemetry-default-certs-0\" (UniqueName: \"kubernetes.io/projected/ac71ef52-bfb2-44d0-be24-71e8e5e58475-openstack-edpm-ipam-telemetry-default-certs-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.176442 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" event={"ID":"ac71ef52-bfb2-44d0-be24-71e8e5e58475","Type":"ContainerDied","Data":"83268ba566fe0f6ec6b22e190663377248b4c36d49b3f47a1beb8be2f1ab5bdf"} Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.176495 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="83268ba566fe0f6ec6b22e190663377248b4c36d49b3f47a1beb8be2f1ab5bdf" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.176528 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/install-certs-edpm-deployment-openstack-edpm-ipam-ppstd" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.278350 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz"] Mar 19 10:56:37 crc kubenswrapper[4765]: E0319 10:56:37.278782 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b27cbdf-9041-4305-a969-586e8b4d09b4" containerName="oc" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.278800 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b27cbdf-9041-4305-a969-586e8b4d09b4" containerName="oc" Mar 19 10:56:37 crc kubenswrapper[4765]: E0319 10:56:37.278832 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac71ef52-bfb2-44d0-be24-71e8e5e58475" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.278841 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac71ef52-bfb2-44d0-be24-71e8e5e58475" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.279051 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b27cbdf-9041-4305-a969-586e8b4d09b4" containerName="oc" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.279081 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac71ef52-bfb2-44d0-be24-71e8e5e58475" containerName="install-certs-edpm-deployment-openstack-edpm-ipam" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.279789 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.292795 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz"] Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.293575 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.293731 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.293887 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.294028 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"ovncontroller-config" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.294176 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.413116 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2n4w8\" (UniqueName: \"kubernetes.io/projected/db0a9fa6-2229-425c-8170-ebcc7dce147f-kube-api-access-2n4w8\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgjhz\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.413591 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgjhz\" (UID: 
\"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.413752 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgjhz\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.413852 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/db0a9fa6-2229-425c-8170-ebcc7dce147f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgjhz\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.413947 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgjhz\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.516119 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2n4w8\" (UniqueName: \"kubernetes.io/projected/db0a9fa6-2229-425c-8170-ebcc7dce147f-kube-api-access-2n4w8\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgjhz\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.516540 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgjhz\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.516657 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgjhz\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.516720 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/db0a9fa6-2229-425c-8170-ebcc7dce147f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgjhz\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.516761 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgjhz\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.518751 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/db0a9fa6-2229-425c-8170-ebcc7dce147f-ovncontroller-config-0\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgjhz\" (UID: 
\"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.522469 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-ovn-combined-ca-bundle\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgjhz\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.522794 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-inventory\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgjhz\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.523396 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-ssh-key-openstack-edpm-ipam\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgjhz\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.539621 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2n4w8\" (UniqueName: \"kubernetes.io/projected/db0a9fa6-2229-425c-8170-ebcc7dce147f-kube-api-access-2n4w8\") pod \"ovn-edpm-deployment-openstack-edpm-ipam-fgjhz\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:37 crc kubenswrapper[4765]: I0319 10:56:37.605864 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:56:38 crc kubenswrapper[4765]: I0319 10:56:38.274239 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz"] Mar 19 10:56:39 crc kubenswrapper[4765]: I0319 10:56:39.196820 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" event={"ID":"db0a9fa6-2229-425c-8170-ebcc7dce147f","Type":"ContainerStarted","Data":"e375059a2ebd5d0ef914c120bef2ac12a0ee24b358ebe8f1b830a0e53d39bbe9"} Mar 19 10:56:40 crc kubenswrapper[4765]: I0319 10:56:40.205724 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" event={"ID":"db0a9fa6-2229-425c-8170-ebcc7dce147f","Type":"ContainerStarted","Data":"07a42f394a2246f74ff98de4258e5d162c976a50bc05c81e669490360687d29f"} Mar 19 10:56:40 crc kubenswrapper[4765]: I0319 10:56:40.222874 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" podStartSLOduration=2.378767282 podStartE2EDuration="3.222844623s" podCreationTimestamp="2026-03-19 10:56:37 +0000 UTC" firstStartedPulling="2026-03-19 10:56:38.346205806 +0000 UTC m=+2096.695151348" lastFinishedPulling="2026-03-19 10:56:39.190283147 +0000 UTC m=+2097.539228689" observedRunningTime="2026-03-19 10:56:40.220846499 +0000 UTC m=+2098.569792061" watchObservedRunningTime="2026-03-19 10:56:40.222844623 +0000 UTC m=+2098.571790165" Mar 19 10:57:00 crc kubenswrapper[4765]: I0319 10:57:00.438080 4765 scope.go:117] "RemoveContainer" containerID="09c9fea2cef66298e6fbd2673086ad3b083adde2f4a2e45fb7f67dcb13962309" Mar 19 10:57:28 crc kubenswrapper[4765]: I0319 10:57:28.566390 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-89wd2"] Mar 19 10:57:28 crc kubenswrapper[4765]: I0319 10:57:28.568947 4765 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:28 crc kubenswrapper[4765]: I0319 10:57:28.652123 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-89wd2"] Mar 19 10:57:28 crc kubenswrapper[4765]: I0319 10:57:28.753050 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zpkjj\" (UniqueName: \"kubernetes.io/projected/3bb59577-a66f-47e0-8340-592edad7a573-kube-api-access-zpkjj\") pod \"redhat-operators-89wd2\" (UID: \"3bb59577-a66f-47e0-8340-592edad7a573\") " pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:28 crc kubenswrapper[4765]: I0319 10:57:28.753134 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bb59577-a66f-47e0-8340-592edad7a573-catalog-content\") pod \"redhat-operators-89wd2\" (UID: \"3bb59577-a66f-47e0-8340-592edad7a573\") " pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:28 crc kubenswrapper[4765]: I0319 10:57:28.753297 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bb59577-a66f-47e0-8340-592edad7a573-utilities\") pod \"redhat-operators-89wd2\" (UID: \"3bb59577-a66f-47e0-8340-592edad7a573\") " pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:28 crc kubenswrapper[4765]: I0319 10:57:28.855141 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zpkjj\" (UniqueName: \"kubernetes.io/projected/3bb59577-a66f-47e0-8340-592edad7a573-kube-api-access-zpkjj\") pod \"redhat-operators-89wd2\" (UID: \"3bb59577-a66f-47e0-8340-592edad7a573\") " pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:28 crc kubenswrapper[4765]: I0319 10:57:28.855208 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bb59577-a66f-47e0-8340-592edad7a573-catalog-content\") pod \"redhat-operators-89wd2\" (UID: \"3bb59577-a66f-47e0-8340-592edad7a573\") " pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:28 crc kubenswrapper[4765]: I0319 10:57:28.855237 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bb59577-a66f-47e0-8340-592edad7a573-utilities\") pod \"redhat-operators-89wd2\" (UID: \"3bb59577-a66f-47e0-8340-592edad7a573\") " pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:28 crc kubenswrapper[4765]: I0319 10:57:28.855721 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bb59577-a66f-47e0-8340-592edad7a573-utilities\") pod \"redhat-operators-89wd2\" (UID: \"3bb59577-a66f-47e0-8340-592edad7a573\") " pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:28 crc kubenswrapper[4765]: I0319 10:57:28.855829 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bb59577-a66f-47e0-8340-592edad7a573-catalog-content\") pod \"redhat-operators-89wd2\" (UID: \"3bb59577-a66f-47e0-8340-592edad7a573\") " pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:28 crc kubenswrapper[4765]: I0319 10:57:28.878739 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zpkjj\" (UniqueName: \"kubernetes.io/projected/3bb59577-a66f-47e0-8340-592edad7a573-kube-api-access-zpkjj\") pod \"redhat-operators-89wd2\" (UID: \"3bb59577-a66f-47e0-8340-592edad7a573\") " pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:28 crc kubenswrapper[4765]: I0319 10:57:28.888846 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:29 crc kubenswrapper[4765]: I0319 10:57:29.351624 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-89wd2"] Mar 19 10:57:29 crc kubenswrapper[4765]: W0319 10:57:29.358546 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3bb59577_a66f_47e0_8340_592edad7a573.slice/crio-a5a6049cdf25b5351607ce379ccff1f811efbd412c84ce5f0b503e5957d0fb09 WatchSource:0}: Error finding container a5a6049cdf25b5351607ce379ccff1f811efbd412c84ce5f0b503e5957d0fb09: Status 404 returned error can't find the container with id a5a6049cdf25b5351607ce379ccff1f811efbd412c84ce5f0b503e5957d0fb09 Mar 19 10:57:29 crc kubenswrapper[4765]: I0319 10:57:29.735035 4765 generic.go:334] "Generic (PLEG): container finished" podID="3bb59577-a66f-47e0-8340-592edad7a573" containerID="aa7ce58894a591c1ca4a074fbfaeb94081f1d407885efcaca6825598e6ca0a08" exitCode=0 Mar 19 10:57:29 crc kubenswrapper[4765]: I0319 10:57:29.735103 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89wd2" event={"ID":"3bb59577-a66f-47e0-8340-592edad7a573","Type":"ContainerDied","Data":"aa7ce58894a591c1ca4a074fbfaeb94081f1d407885efcaca6825598e6ca0a08"} Mar 19 10:57:29 crc kubenswrapper[4765]: I0319 10:57:29.735379 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89wd2" event={"ID":"3bb59577-a66f-47e0-8340-592edad7a573","Type":"ContainerStarted","Data":"a5a6049cdf25b5351607ce379ccff1f811efbd412c84ce5f0b503e5957d0fb09"} Mar 19 10:57:30 crc kubenswrapper[4765]: I0319 10:57:30.746607 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89wd2" 
event={"ID":"3bb59577-a66f-47e0-8340-592edad7a573","Type":"ContainerStarted","Data":"524d9eae736e03843cadcbb9b7afea2b599a2980271ed6a2c2743aaa40f77561"} Mar 19 10:57:31 crc kubenswrapper[4765]: I0319 10:57:31.656594 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:57:31 crc kubenswrapper[4765]: I0319 10:57:31.656670 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:57:31 crc kubenswrapper[4765]: I0319 10:57:31.760062 4765 generic.go:334] "Generic (PLEG): container finished" podID="3bb59577-a66f-47e0-8340-592edad7a573" containerID="524d9eae736e03843cadcbb9b7afea2b599a2980271ed6a2c2743aaa40f77561" exitCode=0 Mar 19 10:57:31 crc kubenswrapper[4765]: I0319 10:57:31.761060 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89wd2" event={"ID":"3bb59577-a66f-47e0-8340-592edad7a573","Type":"ContainerDied","Data":"524d9eae736e03843cadcbb9b7afea2b599a2980271ed6a2c2743aaa40f77561"} Mar 19 10:57:32 crc kubenswrapper[4765]: I0319 10:57:32.770713 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89wd2" event={"ID":"3bb59577-a66f-47e0-8340-592edad7a573","Type":"ContainerStarted","Data":"b786ab70f1170909494b06779ef1614521533170b183d47fd9e8170d9ba91c2c"} Mar 19 10:57:32 crc kubenswrapper[4765]: I0319 10:57:32.792815 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-89wd2" 
podStartSLOduration=2.360630426 podStartE2EDuration="4.792796833s" podCreationTimestamp="2026-03-19 10:57:28 +0000 UTC" firstStartedPulling="2026-03-19 10:57:29.736765789 +0000 UTC m=+2148.085711331" lastFinishedPulling="2026-03-19 10:57:32.168932196 +0000 UTC m=+2150.517877738" observedRunningTime="2026-03-19 10:57:32.785946068 +0000 UTC m=+2151.134891620" watchObservedRunningTime="2026-03-19 10:57:32.792796833 +0000 UTC m=+2151.141742375" Mar 19 10:57:35 crc kubenswrapper[4765]: I0319 10:57:35.799020 4765 generic.go:334] "Generic (PLEG): container finished" podID="db0a9fa6-2229-425c-8170-ebcc7dce147f" containerID="07a42f394a2246f74ff98de4258e5d162c976a50bc05c81e669490360687d29f" exitCode=0 Mar 19 10:57:35 crc kubenswrapper[4765]: I0319 10:57:35.799125 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" event={"ID":"db0a9fa6-2229-425c-8170-ebcc7dce147f","Type":"ContainerDied","Data":"07a42f394a2246f74ff98de4258e5d162c976a50bc05c81e669490360687d29f"} Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.217165 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.340446 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-ssh-key-openstack-edpm-ipam\") pod \"db0a9fa6-2229-425c-8170-ebcc7dce147f\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.340583 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2n4w8\" (UniqueName: \"kubernetes.io/projected/db0a9fa6-2229-425c-8170-ebcc7dce147f-kube-api-access-2n4w8\") pod \"db0a9fa6-2229-425c-8170-ebcc7dce147f\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.340626 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-inventory\") pod \"db0a9fa6-2229-425c-8170-ebcc7dce147f\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.340860 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/db0a9fa6-2229-425c-8170-ebcc7dce147f-ovncontroller-config-0\") pod \"db0a9fa6-2229-425c-8170-ebcc7dce147f\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.340894 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-ovn-combined-ca-bundle\") pod \"db0a9fa6-2229-425c-8170-ebcc7dce147f\" (UID: \"db0a9fa6-2229-425c-8170-ebcc7dce147f\") " Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.347765 4765 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db0a9fa6-2229-425c-8170-ebcc7dce147f-kube-api-access-2n4w8" (OuterVolumeSpecName: "kube-api-access-2n4w8") pod "db0a9fa6-2229-425c-8170-ebcc7dce147f" (UID: "db0a9fa6-2229-425c-8170-ebcc7dce147f"). InnerVolumeSpecName "kube-api-access-2n4w8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.347765 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-ovn-combined-ca-bundle" (OuterVolumeSpecName: "ovn-combined-ca-bundle") pod "db0a9fa6-2229-425c-8170-ebcc7dce147f" (UID: "db0a9fa6-2229-425c-8170-ebcc7dce147f"). InnerVolumeSpecName "ovn-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.371996 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/db0a9fa6-2229-425c-8170-ebcc7dce147f-ovncontroller-config-0" (OuterVolumeSpecName: "ovncontroller-config-0") pod "db0a9fa6-2229-425c-8170-ebcc7dce147f" (UID: "db0a9fa6-2229-425c-8170-ebcc7dce147f"). InnerVolumeSpecName "ovncontroller-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.376205 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "db0a9fa6-2229-425c-8170-ebcc7dce147f" (UID: "db0a9fa6-2229-425c-8170-ebcc7dce147f"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.377123 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-inventory" (OuterVolumeSpecName: "inventory") pod "db0a9fa6-2229-425c-8170-ebcc7dce147f" (UID: "db0a9fa6-2229-425c-8170-ebcc7dce147f"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.443795 4765 reconciler_common.go:293] "Volume detached for volume \"ovncontroller-config-0\" (UniqueName: \"kubernetes.io/configmap/db0a9fa6-2229-425c-8170-ebcc7dce147f-ovncontroller-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.443845 4765 reconciler_common.go:293] "Volume detached for volume \"ovn-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-ovn-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.443859 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.443874 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2n4w8\" (UniqueName: \"kubernetes.io/projected/db0a9fa6-2229-425c-8170-ebcc7dce147f-kube-api-access-2n4w8\") on node \"crc\" DevicePath \"\"" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.443888 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/db0a9fa6-2229-425c-8170-ebcc7dce147f-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.817917 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" event={"ID":"db0a9fa6-2229-425c-8170-ebcc7dce147f","Type":"ContainerDied","Data":"e375059a2ebd5d0ef914c120bef2ac12a0ee24b358ebe8f1b830a0e53d39bbe9"} Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.818033 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e375059a2ebd5d0ef914c120bef2ac12a0ee24b358ebe8f1b830a0e53d39bbe9" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.818027 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/ovn-edpm-deployment-openstack-edpm-ipam-fgjhz" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.927216 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt"] Mar 19 10:57:37 crc kubenswrapper[4765]: E0319 10:57:37.927914 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db0a9fa6-2229-425c-8170-ebcc7dce147f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.927935 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="db0a9fa6-2229-425c-8170-ebcc7dce147f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.928159 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="db0a9fa6-2229-425c-8170-ebcc7dce147f" containerName="ovn-edpm-deployment-openstack-edpm-ipam" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.928783 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.933276 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"neutron-ovn-metadata-agent-neutron-config" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.933276 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.933310 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.933318 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-metadata-neutron-config" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.933432 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.933755 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 10:57:37 crc kubenswrapper[4765]: I0319 10:57:37.945521 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt"] Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.061041 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.061129 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.061194 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swp9v\" (UniqueName: \"kubernetes.io/projected/5987706f-bbd1-4eeb-908e-dd158089aea5-kube-api-access-swp9v\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.061253 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.061302 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.061345 4765 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.162902 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.163062 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swp9v\" (UniqueName: \"kubernetes.io/projected/5987706f-bbd1-4eeb-908e-dd158089aea5-kube-api-access-swp9v\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.163108 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.163141 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.163179 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.163302 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.168815 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-ssh-key-openstack-edpm-ipam\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.168947 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: 
\"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.169934 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-inventory\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.170275 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-neutron-metadata-combined-ca-bundle\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.170613 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-nova-metadata-neutron-config-0\") pod \"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.183536 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swp9v\" (UniqueName: \"kubernetes.io/projected/5987706f-bbd1-4eeb-908e-dd158089aea5-kube-api-access-swp9v\") pod 
\"neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") " pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.246577 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.778889 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt"] Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.860570 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" event={"ID":"5987706f-bbd1-4eeb-908e-dd158089aea5","Type":"ContainerStarted","Data":"d345e4abd50d1eb89fd721937f1b423d61da94b2b3400b258b459cbbd1755a8a"} Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.889507 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.889557 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:38 crc kubenswrapper[4765]: I0319 10:57:38.935831 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:39 crc kubenswrapper[4765]: I0319 10:57:39.872383 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" event={"ID":"5987706f-bbd1-4eeb-908e-dd158089aea5","Type":"ContainerStarted","Data":"a73ef99c128e7e23953b346829594117340ea0e4dd044f25fca782e8cd9aa805"} Mar 19 10:57:39 crc kubenswrapper[4765]: I0319 10:57:39.897441 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" podStartSLOduration=2.329569442 podStartE2EDuration="2.897418465s" podCreationTimestamp="2026-03-19 10:57:37 +0000 UTC" firstStartedPulling="2026-03-19 10:57:38.781999918 +0000 UTC m=+2157.130945460" lastFinishedPulling="2026-03-19 10:57:39.349848921 +0000 UTC m=+2157.698794483" observedRunningTime="2026-03-19 10:57:39.885475652 +0000 UTC m=+2158.234421204" watchObservedRunningTime="2026-03-19 10:57:39.897418465 +0000 UTC m=+2158.246364017" Mar 19 10:57:39 crc kubenswrapper[4765]: I0319 10:57:39.929198 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:39 crc kubenswrapper[4765]: I0319 10:57:39.989276 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-89wd2"] Mar 19 10:57:41 crc kubenswrapper[4765]: I0319 10:57:41.893271 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-89wd2" podUID="3bb59577-a66f-47e0-8340-592edad7a573" containerName="registry-server" containerID="cri-o://b786ab70f1170909494b06779ef1614521533170b183d47fd9e8170d9ba91c2c" gracePeriod=2 Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.345718 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.449272 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zpkjj\" (UniqueName: \"kubernetes.io/projected/3bb59577-a66f-47e0-8340-592edad7a573-kube-api-access-zpkjj\") pod \"3bb59577-a66f-47e0-8340-592edad7a573\" (UID: \"3bb59577-a66f-47e0-8340-592edad7a573\") " Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.449481 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bb59577-a66f-47e0-8340-592edad7a573-catalog-content\") pod \"3bb59577-a66f-47e0-8340-592edad7a573\" (UID: \"3bb59577-a66f-47e0-8340-592edad7a573\") " Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.449662 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bb59577-a66f-47e0-8340-592edad7a573-utilities\") pod \"3bb59577-a66f-47e0-8340-592edad7a573\" (UID: \"3bb59577-a66f-47e0-8340-592edad7a573\") " Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.450506 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bb59577-a66f-47e0-8340-592edad7a573-utilities" (OuterVolumeSpecName: "utilities") pod "3bb59577-a66f-47e0-8340-592edad7a573" (UID: "3bb59577-a66f-47e0-8340-592edad7a573"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.457405 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb59577-a66f-47e0-8340-592edad7a573-kube-api-access-zpkjj" (OuterVolumeSpecName: "kube-api-access-zpkjj") pod "3bb59577-a66f-47e0-8340-592edad7a573" (UID: "3bb59577-a66f-47e0-8340-592edad7a573"). InnerVolumeSpecName "kube-api-access-zpkjj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.553087 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zpkjj\" (UniqueName: \"kubernetes.io/projected/3bb59577-a66f-47e0-8340-592edad7a573-kube-api-access-zpkjj\") on node \"crc\" DevicePath \"\"" Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.553116 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bb59577-a66f-47e0-8340-592edad7a573-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.587342 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bb59577-a66f-47e0-8340-592edad7a573-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bb59577-a66f-47e0-8340-592edad7a573" (UID: "3bb59577-a66f-47e0-8340-592edad7a573"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.655571 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bb59577-a66f-47e0-8340-592edad7a573-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.901507 4765 generic.go:334] "Generic (PLEG): container finished" podID="3bb59577-a66f-47e0-8340-592edad7a573" containerID="b786ab70f1170909494b06779ef1614521533170b183d47fd9e8170d9ba91c2c" exitCode=0 Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.901564 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-89wd2" Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.901560 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89wd2" event={"ID":"3bb59577-a66f-47e0-8340-592edad7a573","Type":"ContainerDied","Data":"b786ab70f1170909494b06779ef1614521533170b183d47fd9e8170d9ba91c2c"} Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.902069 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-89wd2" event={"ID":"3bb59577-a66f-47e0-8340-592edad7a573","Type":"ContainerDied","Data":"a5a6049cdf25b5351607ce379ccff1f811efbd412c84ce5f0b503e5957d0fb09"} Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.902101 4765 scope.go:117] "RemoveContainer" containerID="b786ab70f1170909494b06779ef1614521533170b183d47fd9e8170d9ba91c2c" Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.935181 4765 scope.go:117] "RemoveContainer" containerID="524d9eae736e03843cadcbb9b7afea2b599a2980271ed6a2c2743aaa40f77561" Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.942056 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-89wd2"] Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.953543 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-89wd2"] Mar 19 10:57:42 crc kubenswrapper[4765]: I0319 10:57:42.958125 4765 scope.go:117] "RemoveContainer" containerID="aa7ce58894a591c1ca4a074fbfaeb94081f1d407885efcaca6825598e6ca0a08" Mar 19 10:57:43 crc kubenswrapper[4765]: I0319 10:57:43.003054 4765 scope.go:117] "RemoveContainer" containerID="b786ab70f1170909494b06779ef1614521533170b183d47fd9e8170d9ba91c2c" Mar 19 10:57:43 crc kubenswrapper[4765]: E0319 10:57:43.004134 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b786ab70f1170909494b06779ef1614521533170b183d47fd9e8170d9ba91c2c\": container with ID starting with b786ab70f1170909494b06779ef1614521533170b183d47fd9e8170d9ba91c2c not found: ID does not exist" containerID="b786ab70f1170909494b06779ef1614521533170b183d47fd9e8170d9ba91c2c" Mar 19 10:57:43 crc kubenswrapper[4765]: I0319 10:57:43.004164 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b786ab70f1170909494b06779ef1614521533170b183d47fd9e8170d9ba91c2c"} err="failed to get container status \"b786ab70f1170909494b06779ef1614521533170b183d47fd9e8170d9ba91c2c\": rpc error: code = NotFound desc = could not find container \"b786ab70f1170909494b06779ef1614521533170b183d47fd9e8170d9ba91c2c\": container with ID starting with b786ab70f1170909494b06779ef1614521533170b183d47fd9e8170d9ba91c2c not found: ID does not exist" Mar 19 10:57:43 crc kubenswrapper[4765]: I0319 10:57:43.004187 4765 scope.go:117] "RemoveContainer" containerID="524d9eae736e03843cadcbb9b7afea2b599a2980271ed6a2c2743aaa40f77561" Mar 19 10:57:43 crc kubenswrapper[4765]: E0319 10:57:43.004469 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"524d9eae736e03843cadcbb9b7afea2b599a2980271ed6a2c2743aaa40f77561\": container with ID starting with 524d9eae736e03843cadcbb9b7afea2b599a2980271ed6a2c2743aaa40f77561 not found: ID does not exist" containerID="524d9eae736e03843cadcbb9b7afea2b599a2980271ed6a2c2743aaa40f77561" Mar 19 10:57:43 crc kubenswrapper[4765]: I0319 10:57:43.004545 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"524d9eae736e03843cadcbb9b7afea2b599a2980271ed6a2c2743aaa40f77561"} err="failed to get container status \"524d9eae736e03843cadcbb9b7afea2b599a2980271ed6a2c2743aaa40f77561\": rpc error: code = NotFound desc = could not find container \"524d9eae736e03843cadcbb9b7afea2b599a2980271ed6a2c2743aaa40f77561\": container with ID 
starting with 524d9eae736e03843cadcbb9b7afea2b599a2980271ed6a2c2743aaa40f77561 not found: ID does not exist" Mar 19 10:57:43 crc kubenswrapper[4765]: I0319 10:57:43.004614 4765 scope.go:117] "RemoveContainer" containerID="aa7ce58894a591c1ca4a074fbfaeb94081f1d407885efcaca6825598e6ca0a08" Mar 19 10:57:43 crc kubenswrapper[4765]: E0319 10:57:43.004909 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aa7ce58894a591c1ca4a074fbfaeb94081f1d407885efcaca6825598e6ca0a08\": container with ID starting with aa7ce58894a591c1ca4a074fbfaeb94081f1d407885efcaca6825598e6ca0a08 not found: ID does not exist" containerID="aa7ce58894a591c1ca4a074fbfaeb94081f1d407885efcaca6825598e6ca0a08" Mar 19 10:57:43 crc kubenswrapper[4765]: I0319 10:57:43.004932 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aa7ce58894a591c1ca4a074fbfaeb94081f1d407885efcaca6825598e6ca0a08"} err="failed to get container status \"aa7ce58894a591c1ca4a074fbfaeb94081f1d407885efcaca6825598e6ca0a08\": rpc error: code = NotFound desc = could not find container \"aa7ce58894a591c1ca4a074fbfaeb94081f1d407885efcaca6825598e6ca0a08\": container with ID starting with aa7ce58894a591c1ca4a074fbfaeb94081f1d407885efcaca6825598e6ca0a08 not found: ID does not exist" Mar 19 10:57:44 crc kubenswrapper[4765]: I0319 10:57:44.366172 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bb59577-a66f-47e0-8340-592edad7a573" path="/var/lib/kubelet/pods/3bb59577-a66f-47e0-8340-592edad7a573/volumes" Mar 19 10:58:00 crc kubenswrapper[4765]: I0319 10:58:00.133093 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565298-zd8jt"] Mar 19 10:58:00 crc kubenswrapper[4765]: E0319 10:58:00.134197 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb59577-a66f-47e0-8340-592edad7a573" containerName="registry-server" Mar 19 10:58:00 crc 
kubenswrapper[4765]: I0319 10:58:00.134320 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb59577-a66f-47e0-8340-592edad7a573" containerName="registry-server" Mar 19 10:58:00 crc kubenswrapper[4765]: E0319 10:58:00.134348 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb59577-a66f-47e0-8340-592edad7a573" containerName="extract-utilities" Mar 19 10:58:00 crc kubenswrapper[4765]: I0319 10:58:00.134357 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb59577-a66f-47e0-8340-592edad7a573" containerName="extract-utilities" Mar 19 10:58:00 crc kubenswrapper[4765]: E0319 10:58:00.134390 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb59577-a66f-47e0-8340-592edad7a573" containerName="extract-content" Mar 19 10:58:00 crc kubenswrapper[4765]: I0319 10:58:00.134399 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb59577-a66f-47e0-8340-592edad7a573" containerName="extract-content" Mar 19 10:58:00 crc kubenswrapper[4765]: I0319 10:58:00.134687 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb59577-a66f-47e0-8340-592edad7a573" containerName="registry-server" Mar 19 10:58:00 crc kubenswrapper[4765]: I0319 10:58:00.135597 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565298-zd8jt" Mar 19 10:58:00 crc kubenswrapper[4765]: I0319 10:58:00.138805 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 10:58:00 crc kubenswrapper[4765]: I0319 10:58:00.139139 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 10:58:00 crc kubenswrapper[4765]: I0319 10:58:00.139436 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 10:58:00 crc kubenswrapper[4765]: I0319 10:58:00.145551 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565298-zd8jt"] Mar 19 10:58:00 crc kubenswrapper[4765]: I0319 10:58:00.297216 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmv6d\" (UniqueName: \"kubernetes.io/projected/555dd4fc-9bc7-406b-b094-409a3231e411-kube-api-access-rmv6d\") pod \"auto-csr-approver-29565298-zd8jt\" (UID: \"555dd4fc-9bc7-406b-b094-409a3231e411\") " pod="openshift-infra/auto-csr-approver-29565298-zd8jt" Mar 19 10:58:00 crc kubenswrapper[4765]: I0319 10:58:00.399452 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmv6d\" (UniqueName: \"kubernetes.io/projected/555dd4fc-9bc7-406b-b094-409a3231e411-kube-api-access-rmv6d\") pod \"auto-csr-approver-29565298-zd8jt\" (UID: \"555dd4fc-9bc7-406b-b094-409a3231e411\") " pod="openshift-infra/auto-csr-approver-29565298-zd8jt" Mar 19 10:58:00 crc kubenswrapper[4765]: I0319 10:58:00.417018 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmv6d\" (UniqueName: \"kubernetes.io/projected/555dd4fc-9bc7-406b-b094-409a3231e411-kube-api-access-rmv6d\") pod \"auto-csr-approver-29565298-zd8jt\" (UID: \"555dd4fc-9bc7-406b-b094-409a3231e411\") " 
pod="openshift-infra/auto-csr-approver-29565298-zd8jt" Mar 19 10:58:00 crc kubenswrapper[4765]: I0319 10:58:00.463397 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565298-zd8jt" Mar 19 10:58:00 crc kubenswrapper[4765]: I0319 10:58:00.897587 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565298-zd8jt"] Mar 19 10:58:00 crc kubenswrapper[4765]: I0319 10:58:00.908228 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 10:58:01 crc kubenswrapper[4765]: I0319 10:58:01.056418 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565298-zd8jt" event={"ID":"555dd4fc-9bc7-406b-b094-409a3231e411","Type":"ContainerStarted","Data":"675b4e24d092d665e3b1ae6f0af5106a6cc1fef23ad8590bd694aae1d2ca79c2"} Mar 19 10:58:01 crc kubenswrapper[4765]: I0319 10:58:01.656414 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 10:58:01 crc kubenswrapper[4765]: I0319 10:58:01.656484 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 10:58:03 crc kubenswrapper[4765]: I0319 10:58:03.074939 4765 generic.go:334] "Generic (PLEG): container finished" podID="555dd4fc-9bc7-406b-b094-409a3231e411" containerID="13252fa8860b4c68428c4a9bb1f57507f81f71690225a633cb5da068dc3a1148" exitCode=0 Mar 19 10:58:03 crc kubenswrapper[4765]: I0319 10:58:03.075096 4765 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565298-zd8jt" event={"ID":"555dd4fc-9bc7-406b-b094-409a3231e411","Type":"ContainerDied","Data":"13252fa8860b4c68428c4a9bb1f57507f81f71690225a633cb5da068dc3a1148"} Mar 19 10:58:04 crc kubenswrapper[4765]: I0319 10:58:04.423493 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565298-zd8jt" Mar 19 10:58:04 crc kubenswrapper[4765]: I0319 10:58:04.579679 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rmv6d\" (UniqueName: \"kubernetes.io/projected/555dd4fc-9bc7-406b-b094-409a3231e411-kube-api-access-rmv6d\") pod \"555dd4fc-9bc7-406b-b094-409a3231e411\" (UID: \"555dd4fc-9bc7-406b-b094-409a3231e411\") " Mar 19 10:58:04 crc kubenswrapper[4765]: I0319 10:58:04.585653 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/555dd4fc-9bc7-406b-b094-409a3231e411-kube-api-access-rmv6d" (OuterVolumeSpecName: "kube-api-access-rmv6d") pod "555dd4fc-9bc7-406b-b094-409a3231e411" (UID: "555dd4fc-9bc7-406b-b094-409a3231e411"). InnerVolumeSpecName "kube-api-access-rmv6d". 
PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:58:04 crc kubenswrapper[4765]: I0319 10:58:04.682354 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rmv6d\" (UniqueName: \"kubernetes.io/projected/555dd4fc-9bc7-406b-b094-409a3231e411-kube-api-access-rmv6d\") on node \"crc\" DevicePath \"\""
Mar 19 10:58:05 crc kubenswrapper[4765]: I0319 10:58:05.094515 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565298-zd8jt" event={"ID":"555dd4fc-9bc7-406b-b094-409a3231e411","Type":"ContainerDied","Data":"675b4e24d092d665e3b1ae6f0af5106a6cc1fef23ad8590bd694aae1d2ca79c2"}
Mar 19 10:58:05 crc kubenswrapper[4765]: I0319 10:58:05.094563 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="675b4e24d092d665e3b1ae6f0af5106a6cc1fef23ad8590bd694aae1d2ca79c2"
Mar 19 10:58:05 crc kubenswrapper[4765]: I0319 10:58:05.094622 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565298-zd8jt"
Mar 19 10:58:05 crc kubenswrapper[4765]: I0319 10:58:05.495760 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565292-w8fpg"]
Mar 19 10:58:05 crc kubenswrapper[4765]: I0319 10:58:05.506872 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565292-w8fpg"]
Mar 19 10:58:06 crc kubenswrapper[4765]: I0319 10:58:06.391162 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="257ec543-fc43-45ea-b218-5c7772da983f" path="/var/lib/kubelet/pods/257ec543-fc43-45ea-b218-5c7772da983f/volumes"
Mar 19 10:58:22 crc kubenswrapper[4765]: I0319 10:58:22.244886 4765 generic.go:334] "Generic (PLEG): container finished" podID="5987706f-bbd1-4eeb-908e-dd158089aea5" containerID="a73ef99c128e7e23953b346829594117340ea0e4dd044f25fca782e8cd9aa805" exitCode=0
Mar 19 10:58:22 crc kubenswrapper[4765]: I0319 10:58:22.245003 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" event={"ID":"5987706f-bbd1-4eeb-908e-dd158089aea5","Type":"ContainerDied","Data":"a73ef99c128e7e23953b346829594117340ea0e4dd044f25fca782e8cd9aa805"}
Mar 19 10:58:23 crc kubenswrapper[4765]: I0319 10:58:23.862357 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt"
Mar 19 10:58:23 crc kubenswrapper[4765]: I0319 10:58:23.981874 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-neutron-ovn-metadata-agent-neutron-config-0\") pod \"5987706f-bbd1-4eeb-908e-dd158089aea5\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") "
Mar 19 10:58:23 crc kubenswrapper[4765]: I0319 10:58:23.982350 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-ssh-key-openstack-edpm-ipam\") pod \"5987706f-bbd1-4eeb-908e-dd158089aea5\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") "
Mar 19 10:58:23 crc kubenswrapper[4765]: I0319 10:58:23.982421 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-nova-metadata-neutron-config-0\") pod \"5987706f-bbd1-4eeb-908e-dd158089aea5\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") "
Mar 19 10:58:23 crc kubenswrapper[4765]: I0319 10:58:23.982491 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-inventory\") pod \"5987706f-bbd1-4eeb-908e-dd158089aea5\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") "
Mar 19 10:58:23 crc kubenswrapper[4765]: I0319 10:58:23.982583 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-neutron-metadata-combined-ca-bundle\") pod \"5987706f-bbd1-4eeb-908e-dd158089aea5\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") "
Mar 19 10:58:23 crc kubenswrapper[4765]: I0319 10:58:23.983276 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swp9v\" (UniqueName: \"kubernetes.io/projected/5987706f-bbd1-4eeb-908e-dd158089aea5-kube-api-access-swp9v\") pod \"5987706f-bbd1-4eeb-908e-dd158089aea5\" (UID: \"5987706f-bbd1-4eeb-908e-dd158089aea5\") "
Mar 19 10:58:23 crc kubenswrapper[4765]: I0319 10:58:23.989030 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-neutron-metadata-combined-ca-bundle" (OuterVolumeSpecName: "neutron-metadata-combined-ca-bundle") pod "5987706f-bbd1-4eeb-908e-dd158089aea5" (UID: "5987706f-bbd1-4eeb-908e-dd158089aea5"). InnerVolumeSpecName "neutron-metadata-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:58:23 crc kubenswrapper[4765]: I0319 10:58:23.989991 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5987706f-bbd1-4eeb-908e-dd158089aea5-kube-api-access-swp9v" (OuterVolumeSpecName: "kube-api-access-swp9v") pod "5987706f-bbd1-4eeb-908e-dd158089aea5" (UID: "5987706f-bbd1-4eeb-908e-dd158089aea5"). InnerVolumeSpecName "kube-api-access-swp9v". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.019418 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-nova-metadata-neutron-config-0" (OuterVolumeSpecName: "nova-metadata-neutron-config-0") pod "5987706f-bbd1-4eeb-908e-dd158089aea5" (UID: "5987706f-bbd1-4eeb-908e-dd158089aea5"). InnerVolumeSpecName "nova-metadata-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.020246 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-neutron-ovn-metadata-agent-neutron-config-0" (OuterVolumeSpecName: "neutron-ovn-metadata-agent-neutron-config-0") pod "5987706f-bbd1-4eeb-908e-dd158089aea5" (UID: "5987706f-bbd1-4eeb-908e-dd158089aea5"). InnerVolumeSpecName "neutron-ovn-metadata-agent-neutron-config-0". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.024514 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-inventory" (OuterVolumeSpecName: "inventory") pod "5987706f-bbd1-4eeb-908e-dd158089aea5" (UID: "5987706f-bbd1-4eeb-908e-dd158089aea5"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.027433 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "5987706f-bbd1-4eeb-908e-dd158089aea5" (UID: "5987706f-bbd1-4eeb-908e-dd158089aea5"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.086241 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swp9v\" (UniqueName: \"kubernetes.io/projected/5987706f-bbd1-4eeb-908e-dd158089aea5-kube-api-access-swp9v\") on node \"crc\" DevicePath \"\""
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.086350 4765 reconciler_common.go:293] "Volume detached for volume \"neutron-ovn-metadata-agent-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-neutron-ovn-metadata-agent-neutron-config-0\") on node \"crc\" DevicePath \"\""
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.086407 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\""
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.086424 4765 reconciler_common.go:293] "Volume detached for volume \"nova-metadata-neutron-config-0\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-nova-metadata-neutron-config-0\") on node \"crc\" DevicePath \"\""
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.086435 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-inventory\") on node \"crc\" DevicePath \"\""
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.086448 4765 reconciler_common.go:293] "Volume detached for volume \"neutron-metadata-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/5987706f-bbd1-4eeb-908e-dd158089aea5-neutron-metadata-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.265678 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt" event={"ID":"5987706f-bbd1-4eeb-908e-dd158089aea5","Type":"ContainerDied","Data":"d345e4abd50d1eb89fd721937f1b423d61da94b2b3400b258b459cbbd1755a8a"}
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.265720 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d345e4abd50d1eb89fd721937f1b423d61da94b2b3400b258b459cbbd1755a8a"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.265768 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.369974 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"]
Mar 19 10:58:24 crc kubenswrapper[4765]: E0319 10:58:24.370384 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5987706f-bbd1-4eeb-908e-dd158089aea5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.370403 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5987706f-bbd1-4eeb-908e-dd158089aea5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 19 10:58:24 crc kubenswrapper[4765]: E0319 10:58:24.370424 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="555dd4fc-9bc7-406b-b094-409a3231e411" containerName="oc"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.370433 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="555dd4fc-9bc7-406b-b094-409a3231e411" containerName="oc"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.370663 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5987706f-bbd1-4eeb-908e-dd158089aea5" containerName="neutron-metadata-edpm-deployment-openstack-edpm-ipam"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.370683 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="555dd4fc-9bc7-406b-b094-409a3231e411" containerName="oc"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.371426 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.373882 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.374232 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.374434 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"libvirt-secret"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.374644 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.375191 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.393627 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"]
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.394518 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kplgb\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.394616 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mqms8\" (UniqueName: \"kubernetes.io/projected/895f9304-5267-4b0b-acac-7e0d279b8866-kube-api-access-mqms8\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kplgb\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.394654 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kplgb\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.395062 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kplgb\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.395240 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kplgb\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.496109 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kplgb\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.496208 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kplgb\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.496254 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kplgb\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.496355 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mqms8\" (UniqueName: \"kubernetes.io/projected/895f9304-5267-4b0b-acac-7e0d279b8866-kube-api-access-mqms8\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kplgb\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.496403 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kplgb\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.501868 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-inventory\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kplgb\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.501915 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-libvirt-combined-ca-bundle\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kplgb\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.502326 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-libvirt-secret-0\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kplgb\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.505685 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-ssh-key-openstack-edpm-ipam\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kplgb\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.522175 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mqms8\" (UniqueName: \"kubernetes.io/projected/895f9304-5267-4b0b-acac-7e0d279b8866-kube-api-access-mqms8\") pod \"libvirt-edpm-deployment-openstack-edpm-ipam-kplgb\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:24 crc kubenswrapper[4765]: I0319 10:58:24.689763 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"
Mar 19 10:58:25 crc kubenswrapper[4765]: I0319 10:58:25.342756 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb"]
Mar 19 10:58:25 crc kubenswrapper[4765]: W0319 10:58:25.364701 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod895f9304_5267_4b0b_acac_7e0d279b8866.slice/crio-24b1d53baff6aa6430de704d04b517e9cf8a4de85ba7fafee6a26a5fdeeda7a6 WatchSource:0}: Error finding container 24b1d53baff6aa6430de704d04b517e9cf8a4de85ba7fafee6a26a5fdeeda7a6: Status 404 returned error can't find the container with id 24b1d53baff6aa6430de704d04b517e9cf8a4de85ba7fafee6a26a5fdeeda7a6
Mar 19 10:58:26 crc kubenswrapper[4765]: I0319 10:58:26.297870 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb" event={"ID":"895f9304-5267-4b0b-acac-7e0d279b8866","Type":"ContainerStarted","Data":"ec61750926bbc5a07485d406cf4fb854ef984294b86c17dacaab1c96d923f95c"}
Mar 19 10:58:26 crc kubenswrapper[4765]: I0319 10:58:26.298512 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb" event={"ID":"895f9304-5267-4b0b-acac-7e0d279b8866","Type":"ContainerStarted","Data":"24b1d53baff6aa6430de704d04b517e9cf8a4de85ba7fafee6a26a5fdeeda7a6"}
Mar 19 10:58:26 crc kubenswrapper[4765]: I0319 10:58:26.320069 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb" podStartSLOduration=1.849892964 podStartE2EDuration="2.32005166s" podCreationTimestamp="2026-03-19 10:58:24 +0000 UTC" firstStartedPulling="2026-03-19 10:58:25.368425292 +0000 UTC m=+2203.717370834" lastFinishedPulling="2026-03-19 10:58:25.838583998 +0000 UTC m=+2204.187529530" observedRunningTime="2026-03-19 10:58:26.312396053 +0000 UTC m=+2204.661341605" watchObservedRunningTime="2026-03-19 10:58:26.32005166 +0000 UTC m=+2204.668997202"
Mar 19 10:58:31 crc kubenswrapper[4765]: I0319 10:58:31.656634 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Mar 19 10:58:31 crc kubenswrapper[4765]: I0319 10:58:31.657023 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Mar 19 10:58:31 crc kubenswrapper[4765]: I0319 10:58:31.657076 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l"
Mar 19 10:58:31 crc kubenswrapper[4765]: I0319 10:58:31.657869 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ab93876b58cf4a5eb3ff787b278638a8ff1606d52b12f8abdbd4265ceb51f06d"} pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Mar 19 10:58:31 crc kubenswrapper[4765]: I0319 10:58:31.657926 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://ab93876b58cf4a5eb3ff787b278638a8ff1606d52b12f8abdbd4265ceb51f06d" gracePeriod=600
Mar 19 10:58:32 crc kubenswrapper[4765]: I0319 10:58:32.375723 4765 generic.go:334] "Generic (PLEG): container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerID="ab93876b58cf4a5eb3ff787b278638a8ff1606d52b12f8abdbd4265ceb51f06d" exitCode=0
Mar 19 10:58:32 crc kubenswrapper[4765]: I0319 10:58:32.375774 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"ab93876b58cf4a5eb3ff787b278638a8ff1606d52b12f8abdbd4265ceb51f06d"}
Mar 19 10:58:32 crc kubenswrapper[4765]: I0319 10:58:32.376333 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903"}
Mar 19 10:58:32 crc kubenswrapper[4765]: I0319 10:58:32.376378 4765 scope.go:117] "RemoveContainer" containerID="2c32c7fc2174982c1b92b864f4fbd39de55b747cb6d0551a9875773b86ef3984"
Mar 19 10:59:00 crc kubenswrapper[4765]: I0319 10:59:00.546982 4765 scope.go:117] "RemoveContainer" containerID="1449aac9dcfd24463c7390cacd543a0362a3d57e756744468f966b2a950137ae"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.157882 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6"]
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.161404 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.165091 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.166050 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.171902 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6"]
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.196433 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565300-9bfw7"]
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.197679 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565300-9bfw7"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.201482 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.201606 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.201488 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.210808 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565300-9bfw7"]
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.241230 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dddf5284-7193-4044-a268-6b5df23415bd-config-volume\") pod \"collect-profiles-29565300-8hlw6\" (UID: \"dddf5284-7193-4044-a268-6b5df23415bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.241294 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dddf5284-7193-4044-a268-6b5df23415bd-secret-volume\") pod \"collect-profiles-29565300-8hlw6\" (UID: \"dddf5284-7193-4044-a268-6b5df23415bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.241568 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fjczq\" (UniqueName: \"kubernetes.io/projected/0f824049-f8e9-4909-836d-2bb5ac32f722-kube-api-access-fjczq\") pod \"auto-csr-approver-29565300-9bfw7\" (UID: \"0f824049-f8e9-4909-836d-2bb5ac32f722\") " pod="openshift-infra/auto-csr-approver-29565300-9bfw7"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.241795 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grf22\" (UniqueName: \"kubernetes.io/projected/dddf5284-7193-4044-a268-6b5df23415bd-kube-api-access-grf22\") pod \"collect-profiles-29565300-8hlw6\" (UID: \"dddf5284-7193-4044-a268-6b5df23415bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.344006 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grf22\" (UniqueName: \"kubernetes.io/projected/dddf5284-7193-4044-a268-6b5df23415bd-kube-api-access-grf22\") pod \"collect-profiles-29565300-8hlw6\" (UID: \"dddf5284-7193-4044-a268-6b5df23415bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.344080 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dddf5284-7193-4044-a268-6b5df23415bd-config-volume\") pod \"collect-profiles-29565300-8hlw6\" (UID: \"dddf5284-7193-4044-a268-6b5df23415bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.344108 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dddf5284-7193-4044-a268-6b5df23415bd-secret-volume\") pod \"collect-profiles-29565300-8hlw6\" (UID: \"dddf5284-7193-4044-a268-6b5df23415bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.344209 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fjczq\" (UniqueName: \"kubernetes.io/projected/0f824049-f8e9-4909-836d-2bb5ac32f722-kube-api-access-fjczq\") pod \"auto-csr-approver-29565300-9bfw7\" (UID: \"0f824049-f8e9-4909-836d-2bb5ac32f722\") " pod="openshift-infra/auto-csr-approver-29565300-9bfw7"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.345295 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dddf5284-7193-4044-a268-6b5df23415bd-config-volume\") pod \"collect-profiles-29565300-8hlw6\" (UID: \"dddf5284-7193-4044-a268-6b5df23415bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.355205 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dddf5284-7193-4044-a268-6b5df23415bd-secret-volume\") pod \"collect-profiles-29565300-8hlw6\" (UID: \"dddf5284-7193-4044-a268-6b5df23415bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.361719 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fjczq\" (UniqueName: \"kubernetes.io/projected/0f824049-f8e9-4909-836d-2bb5ac32f722-kube-api-access-fjczq\") pod \"auto-csr-approver-29565300-9bfw7\" (UID: \"0f824049-f8e9-4909-836d-2bb5ac32f722\") " pod="openshift-infra/auto-csr-approver-29565300-9bfw7"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.362260 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grf22\" (UniqueName: \"kubernetes.io/projected/dddf5284-7193-4044-a268-6b5df23415bd-kube-api-access-grf22\") pod \"collect-profiles-29565300-8hlw6\" (UID: \"dddf5284-7193-4044-a268-6b5df23415bd\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.495126 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.521561 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565300-9bfw7"
Mar 19 11:00:00 crc kubenswrapper[4765]: I0319 11:00:00.947436 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6"]
Mar 19 11:00:01 crc kubenswrapper[4765]: I0319 11:00:01.027242 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565300-9bfw7"]
Mar 19 11:00:01 crc kubenswrapper[4765]: I0319 11:00:01.175219 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565300-9bfw7" event={"ID":"0f824049-f8e9-4909-836d-2bb5ac32f722","Type":"ContainerStarted","Data":"ac932010296d579657ecc6636622bdfb162a320232cb957e15d8b5a066628e63"}
Mar 19 11:00:01 crc kubenswrapper[4765]: I0319 11:00:01.176661 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6" event={"ID":"dddf5284-7193-4044-a268-6b5df23415bd","Type":"ContainerStarted","Data":"12194277cd46f22bc26bcad6298ca0c1733e04a683c5f7de946c8512ae243327"}
Mar 19 11:00:01 crc kubenswrapper[4765]: I0319 11:00:01.176713 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6" event={"ID":"dddf5284-7193-4044-a268-6b5df23415bd","Type":"ContainerStarted","Data":"c494521764eefbe771f37dd9451847a83498a7eb4b63219b75c550f04f23898e"}
Mar 19 11:00:01 crc kubenswrapper[4765]: I0319 11:00:01.200403 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6" podStartSLOduration=1.20037837 podStartE2EDuration="1.20037837s" podCreationTimestamp="2026-03-19 11:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:00:01.191681105 +0000 UTC m=+2299.540626667" watchObservedRunningTime="2026-03-19 11:00:01.20037837 +0000 UTC m=+2299.549323922"
Mar 19 11:00:02 crc kubenswrapper[4765]: I0319 11:00:02.187398 4765 generic.go:334] "Generic (PLEG): container finished" podID="dddf5284-7193-4044-a268-6b5df23415bd" containerID="12194277cd46f22bc26bcad6298ca0c1733e04a683c5f7de946c8512ae243327" exitCode=0
Mar 19 11:00:02 crc kubenswrapper[4765]: I0319 11:00:02.187494 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6" event={"ID":"dddf5284-7193-4044-a268-6b5df23415bd","Type":"ContainerDied","Data":"12194277cd46f22bc26bcad6298ca0c1733e04a683c5f7de946c8512ae243327"}
Mar 19 11:00:03 crc kubenswrapper[4765]: I0319 11:00:03.528389 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6"
Mar 19 11:00:03 crc kubenswrapper[4765]: I0319 11:00:03.721572 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dddf5284-7193-4044-a268-6b5df23415bd-config-volume\") pod \"dddf5284-7193-4044-a268-6b5df23415bd\" (UID: \"dddf5284-7193-4044-a268-6b5df23415bd\") "
Mar 19 11:00:03 crc kubenswrapper[4765]: I0319 11:00:03.722185 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dddf5284-7193-4044-a268-6b5df23415bd-secret-volume\") pod \"dddf5284-7193-4044-a268-6b5df23415bd\" (UID: \"dddf5284-7193-4044-a268-6b5df23415bd\") "
Mar 19 11:00:03 crc kubenswrapper[4765]: I0319 11:00:03.722741 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-grf22\" (UniqueName: \"kubernetes.io/projected/dddf5284-7193-4044-a268-6b5df23415bd-kube-api-access-grf22\") pod \"dddf5284-7193-4044-a268-6b5df23415bd\" (UID: \"dddf5284-7193-4044-a268-6b5df23415bd\") "
Mar 19 11:00:03 crc kubenswrapper[4765]: I0319 11:00:03.722923 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/dddf5284-7193-4044-a268-6b5df23415bd-config-volume" (OuterVolumeSpecName: "config-volume") pod "dddf5284-7193-4044-a268-6b5df23415bd" (UID: "dddf5284-7193-4044-a268-6b5df23415bd"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Mar 19 11:00:03 crc kubenswrapper[4765]: I0319 11:00:03.723854 4765 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/dddf5284-7193-4044-a268-6b5df23415bd-config-volume\") on node \"crc\" DevicePath \"\""
Mar 19 11:00:03 crc kubenswrapper[4765]: I0319 11:00:03.728089 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dddf5284-7193-4044-a268-6b5df23415bd-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "dddf5284-7193-4044-a268-6b5df23415bd" (UID: "dddf5284-7193-4044-a268-6b5df23415bd"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue ""
Mar 19 11:00:03 crc kubenswrapper[4765]: I0319 11:00:03.730188 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dddf5284-7193-4044-a268-6b5df23415bd-kube-api-access-grf22" (OuterVolumeSpecName: "kube-api-access-grf22") pod "dddf5284-7193-4044-a268-6b5df23415bd" (UID: "dddf5284-7193-4044-a268-6b5df23415bd"). InnerVolumeSpecName "kube-api-access-grf22". PluginName "kubernetes.io/projected", VolumeGidValue ""
Mar 19 11:00:03 crc kubenswrapper[4765]: I0319 11:00:03.824719 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-grf22\" (UniqueName: \"kubernetes.io/projected/dddf5284-7193-4044-a268-6b5df23415bd-kube-api-access-grf22\") on node \"crc\" DevicePath \"\""
Mar 19 11:00:03 crc kubenswrapper[4765]: I0319 11:00:03.824766 4765 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/dddf5284-7193-4044-a268-6b5df23415bd-secret-volume\") on node \"crc\" DevicePath \"\""
Mar 19 11:00:04 crc kubenswrapper[4765]: I0319 11:00:04.204669 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6" event={"ID":"dddf5284-7193-4044-a268-6b5df23415bd","Type":"ContainerDied","Data":"c494521764eefbe771f37dd9451847a83498a7eb4b63219b75c550f04f23898e"}
Mar 19 11:00:04 crc kubenswrapper[4765]: I0319 11:00:04.204710 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c494521764eefbe771f37dd9451847a83498a7eb4b63219b75c550f04f23898e"
Mar 19 11:00:04 crc kubenswrapper[4765]: I0319 11:00:04.204771 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565300-8hlw6" Mar 19 11:00:04 crc kubenswrapper[4765]: I0319 11:00:04.264865 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v"] Mar 19 11:00:04 crc kubenswrapper[4765]: I0319 11:00:04.272185 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565255-jmv4v"] Mar 19 11:00:04 crc kubenswrapper[4765]: I0319 11:00:04.373362 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b8a97d83-18b0-42eb-9ed9-f49ffff3d034" path="/var/lib/kubelet/pods/b8a97d83-18b0-42eb-9ed9-f49ffff3d034/volumes" Mar 19 11:00:05 crc kubenswrapper[4765]: I0319 11:00:05.214751 4765 generic.go:334] "Generic (PLEG): container finished" podID="0f824049-f8e9-4909-836d-2bb5ac32f722" containerID="eff6eca1fe5d0d1e0b4882c99d4314a795561f5a0400bb936acd85ec1e3dc4cf" exitCode=0 Mar 19 11:00:05 crc kubenswrapper[4765]: I0319 11:00:05.214813 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565300-9bfw7" event={"ID":"0f824049-f8e9-4909-836d-2bb5ac32f722","Type":"ContainerDied","Data":"eff6eca1fe5d0d1e0b4882c99d4314a795561f5a0400bb936acd85ec1e3dc4cf"} Mar 19 11:00:06 crc kubenswrapper[4765]: I0319 11:00:06.626521 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565300-9bfw7" Mar 19 11:00:06 crc kubenswrapper[4765]: I0319 11:00:06.782237 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fjczq\" (UniqueName: \"kubernetes.io/projected/0f824049-f8e9-4909-836d-2bb5ac32f722-kube-api-access-fjczq\") pod \"0f824049-f8e9-4909-836d-2bb5ac32f722\" (UID: \"0f824049-f8e9-4909-836d-2bb5ac32f722\") " Mar 19 11:00:06 crc kubenswrapper[4765]: I0319 11:00:06.788523 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0f824049-f8e9-4909-836d-2bb5ac32f722-kube-api-access-fjczq" (OuterVolumeSpecName: "kube-api-access-fjczq") pod "0f824049-f8e9-4909-836d-2bb5ac32f722" (UID: "0f824049-f8e9-4909-836d-2bb5ac32f722"). InnerVolumeSpecName "kube-api-access-fjczq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:00:06 crc kubenswrapper[4765]: I0319 11:00:06.884671 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fjczq\" (UniqueName: \"kubernetes.io/projected/0f824049-f8e9-4909-836d-2bb5ac32f722-kube-api-access-fjczq\") on node \"crc\" DevicePath \"\"" Mar 19 11:00:07 crc kubenswrapper[4765]: I0319 11:00:07.232512 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565300-9bfw7" event={"ID":"0f824049-f8e9-4909-836d-2bb5ac32f722","Type":"ContainerDied","Data":"ac932010296d579657ecc6636622bdfb162a320232cb957e15d8b5a066628e63"} Mar 19 11:00:07 crc kubenswrapper[4765]: I0319 11:00:07.232549 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac932010296d579657ecc6636622bdfb162a320232cb957e15d8b5a066628e63" Mar 19 11:00:07 crc kubenswrapper[4765]: I0319 11:00:07.232554 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565300-9bfw7" Mar 19 11:00:07 crc kubenswrapper[4765]: I0319 11:00:07.695680 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565294-ggg2c"] Mar 19 11:00:07 crc kubenswrapper[4765]: I0319 11:00:07.706141 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565294-ggg2c"] Mar 19 11:00:08 crc kubenswrapper[4765]: I0319 11:00:08.368125 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d2bbbd3f-dbde-41f4-ad9a-9186d51640e1" path="/var/lib/kubelet/pods/d2bbbd3f-dbde-41f4-ad9a-9186d51640e1/volumes" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.159099 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/keystone-cron-29565301-lwwfz"] Mar 19 11:01:00 crc kubenswrapper[4765]: E0319 11:01:00.160067 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="dddf5284-7193-4044-a268-6b5df23415bd" containerName="collect-profiles" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.160084 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="dddf5284-7193-4044-a268-6b5df23415bd" containerName="collect-profiles" Mar 19 11:01:00 crc kubenswrapper[4765]: E0319 11:01:00.160112 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0f824049-f8e9-4909-836d-2bb5ac32f722" containerName="oc" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.160119 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="0f824049-f8e9-4909-836d-2bb5ac32f722" containerName="oc" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.160297 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="dddf5284-7193-4044-a268-6b5df23415bd" containerName="collect-profiles" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.160315 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="0f824049-f8e9-4909-836d-2bb5ac32f722" 
containerName="oc" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.160933 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/keystone-cron-29565301-lwwfz" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.174041 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565301-lwwfz"] Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.303980 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b2dcv\" (UniqueName: \"kubernetes.io/projected/760a6c55-471d-478e-b75a-713476259c81-kube-api-access-b2dcv\") pod \"keystone-cron-29565301-lwwfz\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " pod="openstack/keystone-cron-29565301-lwwfz" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.304123 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-combined-ca-bundle\") pod \"keystone-cron-29565301-lwwfz\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " pod="openstack/keystone-cron-29565301-lwwfz" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.304186 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-fernet-keys\") pod \"keystone-cron-29565301-lwwfz\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " pod="openstack/keystone-cron-29565301-lwwfz" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.304254 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-config-data\") pod \"keystone-cron-29565301-lwwfz\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " pod="openstack/keystone-cron-29565301-lwwfz" 
Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.407388 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-fernet-keys\") pod \"keystone-cron-29565301-lwwfz\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " pod="openstack/keystone-cron-29565301-lwwfz" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.408062 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-config-data\") pod \"keystone-cron-29565301-lwwfz\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " pod="openstack/keystone-cron-29565301-lwwfz" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.408173 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b2dcv\" (UniqueName: \"kubernetes.io/projected/760a6c55-471d-478e-b75a-713476259c81-kube-api-access-b2dcv\") pod \"keystone-cron-29565301-lwwfz\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " pod="openstack/keystone-cron-29565301-lwwfz" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.408517 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-combined-ca-bundle\") pod \"keystone-cron-29565301-lwwfz\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " pod="openstack/keystone-cron-29565301-lwwfz" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.416809 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-fernet-keys\") pod \"keystone-cron-29565301-lwwfz\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " pod="openstack/keystone-cron-29565301-lwwfz" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.417425 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-combined-ca-bundle\") pod \"keystone-cron-29565301-lwwfz\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " pod="openstack/keystone-cron-29565301-lwwfz" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.418565 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-config-data\") pod \"keystone-cron-29565301-lwwfz\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " pod="openstack/keystone-cron-29565301-lwwfz" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.426800 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b2dcv\" (UniqueName: \"kubernetes.io/projected/760a6c55-471d-478e-b75a-713476259c81-kube-api-access-b2dcv\") pod \"keystone-cron-29565301-lwwfz\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " pod="openstack/keystone-cron-29565301-lwwfz" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.497011 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565301-lwwfz" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.665036 4765 scope.go:117] "RemoveContainer" containerID="c2a3fc499763766084e1f741340244299ced9cfe6bea53084e18f4fc9d4e9a8a" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.701334 4765 scope.go:117] "RemoveContainer" containerID="ea782b2e67066acf7cb5bd5dc88e3bc505ce123975f183eb849fc31610a22820" Mar 19 11:01:00 crc kubenswrapper[4765]: I0319 11:01:00.997359 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/keystone-cron-29565301-lwwfz"] Mar 19 11:01:01 crc kubenswrapper[4765]: I0319 11:01:01.656976 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:01:01 crc kubenswrapper[4765]: I0319 11:01:01.657587 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:01:01 crc kubenswrapper[4765]: I0319 11:01:01.713612 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565301-lwwfz" event={"ID":"760a6c55-471d-478e-b75a-713476259c81","Type":"ContainerStarted","Data":"6843bd5d3bb00158f8afcf72111088df8762c8e74b3bd09234a8baf09cd7f913"} Mar 19 11:01:01 crc kubenswrapper[4765]: I0319 11:01:01.713664 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565301-lwwfz" event={"ID":"760a6c55-471d-478e-b75a-713476259c81","Type":"ContainerStarted","Data":"f2a9a2459a662e1426b9093225579ede70082f90eaf1158d4f2fcba1154af164"} Mar 19 11:01:01 crc 
kubenswrapper[4765]: I0319 11:01:01.729335 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/keystone-cron-29565301-lwwfz" podStartSLOduration=1.7293118760000001 podStartE2EDuration="1.729311876s" podCreationTimestamp="2026-03-19 11:01:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:01:01.729038389 +0000 UTC m=+2360.077983951" watchObservedRunningTime="2026-03-19 11:01:01.729311876 +0000 UTC m=+2360.078257418" Mar 19 11:01:03 crc kubenswrapper[4765]: I0319 11:01:03.731018 4765 generic.go:334] "Generic (PLEG): container finished" podID="760a6c55-471d-478e-b75a-713476259c81" containerID="6843bd5d3bb00158f8afcf72111088df8762c8e74b3bd09234a8baf09cd7f913" exitCode=0 Mar 19 11:01:03 crc kubenswrapper[4765]: I0319 11:01:03.731325 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565301-lwwfz" event={"ID":"760a6c55-471d-478e-b75a-713476259c81","Type":"ContainerDied","Data":"6843bd5d3bb00158f8afcf72111088df8762c8e74b3bd09234a8baf09cd7f913"} Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.069358 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565301-lwwfz" Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.235139 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-config-data\") pod \"760a6c55-471d-478e-b75a-713476259c81\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.235194 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-combined-ca-bundle\") pod \"760a6c55-471d-478e-b75a-713476259c81\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.235372 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b2dcv\" (UniqueName: \"kubernetes.io/projected/760a6c55-471d-478e-b75a-713476259c81-kube-api-access-b2dcv\") pod \"760a6c55-471d-478e-b75a-713476259c81\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.235391 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-fernet-keys\") pod \"760a6c55-471d-478e-b75a-713476259c81\" (UID: \"760a6c55-471d-478e-b75a-713476259c81\") " Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.242694 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "760a6c55-471d-478e-b75a-713476259c81" (UID: "760a6c55-471d-478e-b75a-713476259c81"). InnerVolumeSpecName "fernet-keys". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.244594 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/760a6c55-471d-478e-b75a-713476259c81-kube-api-access-b2dcv" (OuterVolumeSpecName: "kube-api-access-b2dcv") pod "760a6c55-471d-478e-b75a-713476259c81" (UID: "760a6c55-471d-478e-b75a-713476259c81"). InnerVolumeSpecName "kube-api-access-b2dcv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.276090 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "760a6c55-471d-478e-b75a-713476259c81" (UID: "760a6c55-471d-478e-b75a-713476259c81"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.293559 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-config-data" (OuterVolumeSpecName: "config-data") pod "760a6c55-471d-478e-b75a-713476259c81" (UID: "760a6c55-471d-478e-b75a-713476259c81"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.338229 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b2dcv\" (UniqueName: \"kubernetes.io/projected/760a6c55-471d-478e-b75a-713476259c81-kube-api-access-b2dcv\") on node \"crc\" DevicePath \"\"" Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.338271 4765 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-fernet-keys\") on node \"crc\" DevicePath \"\"" Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.338284 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.338292 4765 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/760a6c55-471d-478e-b75a-713476259c81-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.749586 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/keystone-cron-29565301-lwwfz" event={"ID":"760a6c55-471d-478e-b75a-713476259c81","Type":"ContainerDied","Data":"f2a9a2459a662e1426b9093225579ede70082f90eaf1158d4f2fcba1154af164"} Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.749633 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f2a9a2459a662e1426b9093225579ede70082f90eaf1158d4f2fcba1154af164" Mar 19 11:01:05 crc kubenswrapper[4765]: I0319 11:01:05.749688 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/keystone-cron-29565301-lwwfz" Mar 19 11:01:31 crc kubenswrapper[4765]: I0319 11:01:31.656114 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:01:31 crc kubenswrapper[4765]: I0319 11:01:31.656672 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:01:58 crc kubenswrapper[4765]: I0319 11:01:58.236713 4765 generic.go:334] "Generic (PLEG): container finished" podID="895f9304-5267-4b0b-acac-7e0d279b8866" containerID="ec61750926bbc5a07485d406cf4fb854ef984294b86c17dacaab1c96d923f95c" exitCode=0 Mar 19 11:01:58 crc kubenswrapper[4765]: I0319 11:01:58.236801 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb" event={"ID":"895f9304-5267-4b0b-acac-7e0d279b8866","Type":"ContainerDied","Data":"ec61750926bbc5a07485d406cf4fb854ef984294b86c17dacaab1c96d923f95c"} Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.669418 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb" Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.864851 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mqms8\" (UniqueName: \"kubernetes.io/projected/895f9304-5267-4b0b-acac-7e0d279b8866-kube-api-access-mqms8\") pod \"895f9304-5267-4b0b-acac-7e0d279b8866\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.865147 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-inventory\") pod \"895f9304-5267-4b0b-acac-7e0d279b8866\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.865366 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-libvirt-secret-0\") pod \"895f9304-5267-4b0b-acac-7e0d279b8866\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.865450 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-libvirt-combined-ca-bundle\") pod \"895f9304-5267-4b0b-acac-7e0d279b8866\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.865491 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-ssh-key-openstack-edpm-ipam\") pod \"895f9304-5267-4b0b-acac-7e0d279b8866\" (UID: \"895f9304-5267-4b0b-acac-7e0d279b8866\") " Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.871156 4765 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-libvirt-combined-ca-bundle" (OuterVolumeSpecName: "libvirt-combined-ca-bundle") pod "895f9304-5267-4b0b-acac-7e0d279b8866" (UID: "895f9304-5267-4b0b-acac-7e0d279b8866"). InnerVolumeSpecName "libvirt-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.871215 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/895f9304-5267-4b0b-acac-7e0d279b8866-kube-api-access-mqms8" (OuterVolumeSpecName: "kube-api-access-mqms8") pod "895f9304-5267-4b0b-acac-7e0d279b8866" (UID: "895f9304-5267-4b0b-acac-7e0d279b8866"). InnerVolumeSpecName "kube-api-access-mqms8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.897467 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-libvirt-secret-0" (OuterVolumeSpecName: "libvirt-secret-0") pod "895f9304-5267-4b0b-acac-7e0d279b8866" (UID: "895f9304-5267-4b0b-acac-7e0d279b8866"). InnerVolumeSpecName "libvirt-secret-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.898585 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-inventory" (OuterVolumeSpecName: "inventory") pod "895f9304-5267-4b0b-acac-7e0d279b8866" (UID: "895f9304-5267-4b0b-acac-7e0d279b8866"). InnerVolumeSpecName "inventory". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.898816 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "895f9304-5267-4b0b-acac-7e0d279b8866" (UID: "895f9304-5267-4b0b-acac-7e0d279b8866"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.968347 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mqms8\" (UniqueName: \"kubernetes.io/projected/895f9304-5267-4b0b-acac-7e0d279b8866-kube-api-access-mqms8\") on node \"crc\" DevicePath \"\"" Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.968387 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.968400 4765 reconciler_common.go:293] "Volume detached for volume \"libvirt-secret-0\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-libvirt-secret-0\") on node \"crc\" DevicePath \"\"" Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.968414 4765 reconciler_common.go:293] "Volume detached for volume \"libvirt-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-libvirt-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 11:01:59 crc kubenswrapper[4765]: I0319 11:01:59.968428 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/895f9304-5267-4b0b-acac-7e0d279b8866-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.148484 4765 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565302-h4fq5"] Mar 19 11:02:00 crc kubenswrapper[4765]: E0319 11:02:00.149221 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="760a6c55-471d-478e-b75a-713476259c81" containerName="keystone-cron" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.149246 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="760a6c55-471d-478e-b75a-713476259c81" containerName="keystone-cron" Mar 19 11:02:00 crc kubenswrapper[4765]: E0319 11:02:00.149280 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="895f9304-5267-4b0b-acac-7e0d279b8866" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.149291 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="895f9304-5267-4b0b-acac-7e0d279b8866" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.149495 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="760a6c55-471d-478e-b75a-713476259c81" containerName="keystone-cron" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.149538 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="895f9304-5267-4b0b-acac-7e0d279b8866" containerName="libvirt-edpm-deployment-openstack-edpm-ipam" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.150366 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565302-h4fq5" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.153230 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.154090 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.166663 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.169528 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565302-h4fq5"] Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.257466 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb" event={"ID":"895f9304-5267-4b0b-acac-7e0d279b8866","Type":"ContainerDied","Data":"24b1d53baff6aa6430de704d04b517e9cf8a4de85ba7fafee6a26a5fdeeda7a6"} Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.257508 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="24b1d53baff6aa6430de704d04b517e9cf8a4de85ba7fafee6a26a5fdeeda7a6" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.257565 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/libvirt-edpm-deployment-openstack-edpm-ipam-kplgb" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.273394 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pbgpj\" (UniqueName: \"kubernetes.io/projected/15bb417d-3ab4-44df-a950-bbd7c17b289a-kube-api-access-pbgpj\") pod \"auto-csr-approver-29565302-h4fq5\" (UID: \"15bb417d-3ab4-44df-a950-bbd7c17b289a\") " pod="openshift-infra/auto-csr-approver-29565302-h4fq5" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.354665 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9"] Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.356129 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.358371 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.359053 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"nova-extra-config" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.359074 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-cell1-compute-config" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.359081 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.359079 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.360244 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"nova-migration-ssh-key" Mar 19 11:02:00 crc kubenswrapper[4765]: 
I0319 11:02:00.360521 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.375044 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pbgpj\" (UniqueName: \"kubernetes.io/projected/15bb417d-3ab4-44df-a950-bbd7c17b289a-kube-api-access-pbgpj\") pod \"auto-csr-approver-29565302-h4fq5\" (UID: \"15bb417d-3ab4-44df-a950-bbd7c17b289a\") " pod="openshift-infra/auto-csr-approver-29565302-h4fq5" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.385905 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9"] Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.408749 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pbgpj\" (UniqueName: \"kubernetes.io/projected/15bb417d-3ab4-44df-a950-bbd7c17b289a-kube-api-access-pbgpj\") pod \"auto-csr-approver-29565302-h4fq5\" (UID: \"15bb417d-3ab4-44df-a950-bbd7c17b289a\") " pod="openshift-infra/auto-csr-approver-29565302-h4fq5" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.476033 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565302-h4fq5" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.476887 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.476950 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.477052 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.477126 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.477178 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.477220 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.477431 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.477532 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.477569 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: 
\"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.477634 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cjvgd\" (UniqueName: \"kubernetes.io/projected/f9cf075c-03d2-4254-9ab9-5500d4f42186-kube-api-access-cjvgd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.477753 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.579100 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.579405 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.579435 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.579460 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.579491 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cjvgd\" (UniqueName: \"kubernetes.io/projected/f9cf075c-03d2-4254-9ab9-5500d4f42186-kube-api-access-cjvgd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.579530 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.579583 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" 
(UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.579604 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.579636 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.579663 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.579684 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.583381 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-extra-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.587935 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.588264 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-2\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.591255 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-inventory\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.591765 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: 
\"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-ssh-key-openstack-edpm-ipam\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.591774 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.597704 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-3\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.599037 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-combined-ca-bundle\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.599288 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-migration-ssh-key-0\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " 
pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.601953 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-migration-ssh-key-1\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.602257 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cjvgd\" (UniqueName: \"kubernetes.io/projected/f9cf075c-03d2-4254-9ab9-5500d4f42186-kube-api-access-cjvgd\") pod \"nova-edpm-deployment-openstack-edpm-ipam-skpp9\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.690490 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:02:00 crc kubenswrapper[4765]: I0319 11:02:00.944639 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565302-h4fq5"] Mar 19 11:02:01 crc kubenswrapper[4765]: I0319 11:02:01.242946 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9"] Mar 19 11:02:01 crc kubenswrapper[4765]: W0319 11:02:01.244329 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf9cf075c_03d2_4254_9ab9_5500d4f42186.slice/crio-8d10a35e176a91c6d0b8644f2ae5509030b79a556c7e24de96d0f1c6b5664192 WatchSource:0}: Error finding container 8d10a35e176a91c6d0b8644f2ae5509030b79a556c7e24de96d0f1c6b5664192: Status 404 returned error can't find the container with id 8d10a35e176a91c6d0b8644f2ae5509030b79a556c7e24de96d0f1c6b5664192 Mar 19 11:02:01 crc kubenswrapper[4765]: I0319 11:02:01.268163 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" event={"ID":"f9cf075c-03d2-4254-9ab9-5500d4f42186","Type":"ContainerStarted","Data":"8d10a35e176a91c6d0b8644f2ae5509030b79a556c7e24de96d0f1c6b5664192"} Mar 19 11:02:01 crc kubenswrapper[4765]: I0319 11:02:01.269677 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565302-h4fq5" event={"ID":"15bb417d-3ab4-44df-a950-bbd7c17b289a","Type":"ContainerStarted","Data":"e9961ba7bc7d1051c705cbc982f204b0564050da91f8fb075aaf2b33a6ae2f21"} Mar 19 11:02:01 crc kubenswrapper[4765]: I0319 11:02:01.656486 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= 
Mar 19 11:02:01 crc kubenswrapper[4765]: I0319 11:02:01.656851 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:02:01 crc kubenswrapper[4765]: I0319 11:02:01.656917 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 11:02:01 crc kubenswrapper[4765]: I0319 11:02:01.657808 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903"} pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 11:02:01 crc kubenswrapper[4765]: I0319 11:02:01.657888 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" gracePeriod=600 Mar 19 11:02:01 crc kubenswrapper[4765]: E0319 11:02:01.782943 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:02:02 crc kubenswrapper[4765]: I0319 11:02:02.282178 4765 
generic.go:334] "Generic (PLEG): container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" exitCode=0 Mar 19 11:02:02 crc kubenswrapper[4765]: I0319 11:02:02.282258 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903"} Mar 19 11:02:02 crc kubenswrapper[4765]: I0319 11:02:02.282449 4765 scope.go:117] "RemoveContainer" containerID="ab93876b58cf4a5eb3ff787b278638a8ff1606d52b12f8abdbd4265ceb51f06d" Mar 19 11:02:02 crc kubenswrapper[4765]: I0319 11:02:02.283055 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:02:02 crc kubenswrapper[4765]: E0319 11:02:02.283331 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:02:02 crc kubenswrapper[4765]: I0319 11:02:02.292262 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565302-h4fq5" event={"ID":"15bb417d-3ab4-44df-a950-bbd7c17b289a","Type":"ContainerStarted","Data":"b9b178bcde609482bc72b19cd7b52e1399b2ddad472355f66e1f65a67d234566"} Mar 19 11:02:02 crc kubenswrapper[4765]: I0319 11:02:02.319643 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565302-h4fq5" podStartSLOduration=1.38997314 podStartE2EDuration="2.319620972s" podCreationTimestamp="2026-03-19 11:02:00 
+0000 UTC" firstStartedPulling="2026-03-19 11:02:00.950280049 +0000 UTC m=+2419.299225591" lastFinishedPulling="2026-03-19 11:02:01.879927891 +0000 UTC m=+2420.228873423" observedRunningTime="2026-03-19 11:02:02.316734614 +0000 UTC m=+2420.665680146" watchObservedRunningTime="2026-03-19 11:02:02.319620972 +0000 UTC m=+2420.668566514" Mar 19 11:02:03 crc kubenswrapper[4765]: I0319 11:02:03.305000 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" event={"ID":"f9cf075c-03d2-4254-9ab9-5500d4f42186","Type":"ContainerStarted","Data":"ad6a35415cfed9684eaf139936e0b0a62c54b279abfc24e8bbea14e59dde6355"} Mar 19 11:02:03 crc kubenswrapper[4765]: I0319 11:02:03.307307 4765 generic.go:334] "Generic (PLEG): container finished" podID="15bb417d-3ab4-44df-a950-bbd7c17b289a" containerID="b9b178bcde609482bc72b19cd7b52e1399b2ddad472355f66e1f65a67d234566" exitCode=0 Mar 19 11:02:03 crc kubenswrapper[4765]: I0319 11:02:03.307511 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565302-h4fq5" event={"ID":"15bb417d-3ab4-44df-a950-bbd7c17b289a","Type":"ContainerDied","Data":"b9b178bcde609482bc72b19cd7b52e1399b2ddad472355f66e1f65a67d234566"} Mar 19 11:02:03 crc kubenswrapper[4765]: I0319 11:02:03.333358 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" podStartSLOduration=2.107019723 podStartE2EDuration="3.333336529s" podCreationTimestamp="2026-03-19 11:02:00 +0000 UTC" firstStartedPulling="2026-03-19 11:02:01.247470036 +0000 UTC m=+2419.596415578" lastFinishedPulling="2026-03-19 11:02:02.473786842 +0000 UTC m=+2420.822732384" observedRunningTime="2026-03-19 11:02:03.3285752 +0000 UTC m=+2421.677520772" watchObservedRunningTime="2026-03-19 11:02:03.333336529 +0000 UTC m=+2421.682282071" Mar 19 11:02:04 crc kubenswrapper[4765]: I0319 11:02:04.683075 4765 util.go:48] "No ready sandbox for pod can 
be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565302-h4fq5" Mar 19 11:02:04 crc kubenswrapper[4765]: I0319 11:02:04.796014 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pbgpj\" (UniqueName: \"kubernetes.io/projected/15bb417d-3ab4-44df-a950-bbd7c17b289a-kube-api-access-pbgpj\") pod \"15bb417d-3ab4-44df-a950-bbd7c17b289a\" (UID: \"15bb417d-3ab4-44df-a950-bbd7c17b289a\") " Mar 19 11:02:04 crc kubenswrapper[4765]: I0319 11:02:04.802596 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15bb417d-3ab4-44df-a950-bbd7c17b289a-kube-api-access-pbgpj" (OuterVolumeSpecName: "kube-api-access-pbgpj") pod "15bb417d-3ab4-44df-a950-bbd7c17b289a" (UID: "15bb417d-3ab4-44df-a950-bbd7c17b289a"). InnerVolumeSpecName "kube-api-access-pbgpj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:02:04 crc kubenswrapper[4765]: I0319 11:02:04.899018 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pbgpj\" (UniqueName: \"kubernetes.io/projected/15bb417d-3ab4-44df-a950-bbd7c17b289a-kube-api-access-pbgpj\") on node \"crc\" DevicePath \"\"" Mar 19 11:02:05 crc kubenswrapper[4765]: I0319 11:02:05.337678 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565302-h4fq5" event={"ID":"15bb417d-3ab4-44df-a950-bbd7c17b289a","Type":"ContainerDied","Data":"e9961ba7bc7d1051c705cbc982f204b0564050da91f8fb075aaf2b33a6ae2f21"} Mar 19 11:02:05 crc kubenswrapper[4765]: I0319 11:02:05.338067 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e9961ba7bc7d1051c705cbc982f204b0564050da91f8fb075aaf2b33a6ae2f21" Mar 19 11:02:05 crc kubenswrapper[4765]: I0319 11:02:05.337751 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565302-h4fq5" Mar 19 11:02:05 crc kubenswrapper[4765]: I0319 11:02:05.390798 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565296-jm5j9"] Mar 19 11:02:05 crc kubenswrapper[4765]: I0319 11:02:05.398309 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565296-jm5j9"] Mar 19 11:02:06 crc kubenswrapper[4765]: I0319 11:02:06.367860 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b27cbdf-9041-4305-a969-586e8b4d09b4" path="/var/lib/kubelet/pods/1b27cbdf-9041-4305-a969-586e8b4d09b4/volumes" Mar 19 11:02:17 crc kubenswrapper[4765]: I0319 11:02:17.357293 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:02:17 crc kubenswrapper[4765]: E0319 11:02:17.358673 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:02:32 crc kubenswrapper[4765]: I0319 11:02:32.363845 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:02:32 crc kubenswrapper[4765]: E0319 11:02:32.364992 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" 
podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:02:44 crc kubenswrapper[4765]: I0319 11:02:44.356222 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:02:44 crc kubenswrapper[4765]: E0319 11:02:44.357292 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.356752 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:02:57 crc kubenswrapper[4765]: E0319 11:02:57.357633 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.540739 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-krwfw"] Mar 19 11:02:57 crc kubenswrapper[4765]: E0319 11:02:57.541208 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15bb417d-3ab4-44df-a950-bbd7c17b289a" containerName="oc" Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.541232 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="15bb417d-3ab4-44df-a950-bbd7c17b289a" containerName="oc" Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.541441 4765 
memory_manager.go:354] "RemoveStaleState removing state" podUID="15bb417d-3ab4-44df-a950-bbd7c17b289a" containerName="oc" Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.542922 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.561166 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krwfw"] Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.599826 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkrs7\" (UniqueName: \"kubernetes.io/projected/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-kube-api-access-vkrs7\") pod \"redhat-marketplace-krwfw\" (UID: \"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a\") " pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.599882 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-catalog-content\") pod \"redhat-marketplace-krwfw\" (UID: \"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a\") " pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.600032 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-utilities\") pod \"redhat-marketplace-krwfw\" (UID: \"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a\") " pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.702474 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkrs7\" (UniqueName: \"kubernetes.io/projected/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-kube-api-access-vkrs7\") 
pod \"redhat-marketplace-krwfw\" (UID: \"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a\") " pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.702552 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-catalog-content\") pod \"redhat-marketplace-krwfw\" (UID: \"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a\") " pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.702633 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-utilities\") pod \"redhat-marketplace-krwfw\" (UID: \"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a\") " pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.703238 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-catalog-content\") pod \"redhat-marketplace-krwfw\" (UID: \"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a\") " pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.703254 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-utilities\") pod \"redhat-marketplace-krwfw\" (UID: \"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a\") " pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.723652 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkrs7\" (UniqueName: \"kubernetes.io/projected/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-kube-api-access-vkrs7\") pod \"redhat-marketplace-krwfw\" (UID: 
\"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a\") " pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:02:57 crc kubenswrapper[4765]: I0319 11:02:57.907925 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:02:58 crc kubenswrapper[4765]: I0319 11:02:58.372745 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-krwfw"] Mar 19 11:02:58 crc kubenswrapper[4765]: I0319 11:02:58.847696 4765 generic.go:334] "Generic (PLEG): container finished" podID="d7d37ee5-8b10-41b9-a017-3b1b5c928f7a" containerID="2649fbb946079e0a2f5bef3e1a0c2acec8e20d29f7c2911ce787eaf9c58bf950" exitCode=0 Mar 19 11:02:58 crc kubenswrapper[4765]: I0319 11:02:58.847864 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krwfw" event={"ID":"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a","Type":"ContainerDied","Data":"2649fbb946079e0a2f5bef3e1a0c2acec8e20d29f7c2911ce787eaf9c58bf950"} Mar 19 11:02:58 crc kubenswrapper[4765]: I0319 11:02:58.848032 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krwfw" event={"ID":"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a","Type":"ContainerStarted","Data":"8e278f9ceb2f7043c5311b9556bf3b84f1ad624ec6d36119763b847cea65e493"} Mar 19 11:03:00 crc kubenswrapper[4765]: I0319 11:03:00.857443 4765 scope.go:117] "RemoveContainer" containerID="d92d543c3ed27b928edd3cf54bedff8a1dbc17673c4568aac83602be8ee7af84" Mar 19 11:03:00 crc kubenswrapper[4765]: I0319 11:03:00.874451 4765 generic.go:334] "Generic (PLEG): container finished" podID="d7d37ee5-8b10-41b9-a017-3b1b5c928f7a" containerID="070eb0010e9eebd824c09f4879541ed889e4b2b6d8f07cd4da5194268db1f0f1" exitCode=0 Mar 19 11:03:00 crc kubenswrapper[4765]: I0319 11:03:00.874546 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krwfw" 
event={"ID":"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a","Type":"ContainerDied","Data":"070eb0010e9eebd824c09f4879541ed889e4b2b6d8f07cd4da5194268db1f0f1"} Mar 19 11:03:00 crc kubenswrapper[4765]: I0319 11:03:00.917606 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 11:03:01 crc kubenswrapper[4765]: I0319 11:03:01.888458 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krwfw" event={"ID":"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a","Type":"ContainerStarted","Data":"669dff6ea4e2449af5504aa6f2c048f39255494c42692d6481f6b969bd100c8a"} Mar 19 11:03:01 crc kubenswrapper[4765]: I0319 11:03:01.921812 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-krwfw" podStartSLOduration=2.414725297 podStartE2EDuration="4.921780929s" podCreationTimestamp="2026-03-19 11:02:57 +0000 UTC" firstStartedPulling="2026-03-19 11:02:58.849709186 +0000 UTC m=+2477.198654738" lastFinishedPulling="2026-03-19 11:03:01.356764838 +0000 UTC m=+2479.705710370" observedRunningTime="2026-03-19 11:03:01.916209559 +0000 UTC m=+2480.265155121" watchObservedRunningTime="2026-03-19 11:03:01.921780929 +0000 UTC m=+2480.270726501" Mar 19 11:03:07 crc kubenswrapper[4765]: I0319 11:03:07.908397 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:03:07 crc kubenswrapper[4765]: I0319 11:03:07.909081 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:03:07 crc kubenswrapper[4765]: I0319 11:03:07.953574 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:03:08 crc kubenswrapper[4765]: I0319 11:03:08.006094 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:03:08 crc kubenswrapper[4765]: I0319 11:03:08.196987 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krwfw"] Mar 19 11:03:09 crc kubenswrapper[4765]: I0319 11:03:09.357243 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:03:09 crc kubenswrapper[4765]: E0319 11:03:09.357881 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:03:09 crc kubenswrapper[4765]: I0319 11:03:09.967082 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-krwfw" podUID="d7d37ee5-8b10-41b9-a017-3b1b5c928f7a" containerName="registry-server" containerID="cri-o://669dff6ea4e2449af5504aa6f2c048f39255494c42692d6481f6b969bd100c8a" gracePeriod=2 Mar 19 11:03:10 crc kubenswrapper[4765]: I0319 11:03:10.391844 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:03:10 crc kubenswrapper[4765]: I0319 11:03:10.471835 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-catalog-content\") pod \"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a\" (UID: \"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a\") " Mar 19 11:03:10 crc kubenswrapper[4765]: I0319 11:03:10.471876 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-utilities\") pod \"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a\" (UID: \"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a\") " Mar 19 11:03:10 crc kubenswrapper[4765]: I0319 11:03:10.472213 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkrs7\" (UniqueName: \"kubernetes.io/projected/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-kube-api-access-vkrs7\") pod \"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a\" (UID: \"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a\") " Mar 19 11:03:10 crc kubenswrapper[4765]: I0319 11:03:10.473289 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-utilities" (OuterVolumeSpecName: "utilities") pod "d7d37ee5-8b10-41b9-a017-3b1b5c928f7a" (UID: "d7d37ee5-8b10-41b9-a017-3b1b5c928f7a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:03:10 crc kubenswrapper[4765]: I0319 11:03:10.478596 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-kube-api-access-vkrs7" (OuterVolumeSpecName: "kube-api-access-vkrs7") pod "d7d37ee5-8b10-41b9-a017-3b1b5c928f7a" (UID: "d7d37ee5-8b10-41b9-a017-3b1b5c928f7a"). InnerVolumeSpecName "kube-api-access-vkrs7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:03:10 crc kubenswrapper[4765]: I0319 11:03:10.512407 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d7d37ee5-8b10-41b9-a017-3b1b5c928f7a" (UID: "d7d37ee5-8b10-41b9-a017-3b1b5c928f7a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:03:10 crc kubenswrapper[4765]: I0319 11:03:10.575754 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkrs7\" (UniqueName: \"kubernetes.io/projected/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-kube-api-access-vkrs7\") on node \"crc\" DevicePath \"\"" Mar 19 11:03:10 crc kubenswrapper[4765]: I0319 11:03:10.575795 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 11:03:10 crc kubenswrapper[4765]: I0319 11:03:10.575808 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 11:03:10 crc kubenswrapper[4765]: I0319 11:03:10.978032 4765 generic.go:334] "Generic (PLEG): container finished" podID="d7d37ee5-8b10-41b9-a017-3b1b5c928f7a" containerID="669dff6ea4e2449af5504aa6f2c048f39255494c42692d6481f6b969bd100c8a" exitCode=0 Mar 19 11:03:10 crc kubenswrapper[4765]: I0319 11:03:10.978136 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-krwfw" Mar 19 11:03:10 crc kubenswrapper[4765]: I0319 11:03:10.978168 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krwfw" event={"ID":"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a","Type":"ContainerDied","Data":"669dff6ea4e2449af5504aa6f2c048f39255494c42692d6481f6b969bd100c8a"} Mar 19 11:03:10 crc kubenswrapper[4765]: I0319 11:03:10.978468 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-krwfw" event={"ID":"d7d37ee5-8b10-41b9-a017-3b1b5c928f7a","Type":"ContainerDied","Data":"8e278f9ceb2f7043c5311b9556bf3b84f1ad624ec6d36119763b847cea65e493"} Mar 19 11:03:10 crc kubenswrapper[4765]: I0319 11:03:10.978499 4765 scope.go:117] "RemoveContainer" containerID="669dff6ea4e2449af5504aa6f2c048f39255494c42692d6481f6b969bd100c8a" Mar 19 11:03:11 crc kubenswrapper[4765]: I0319 11:03:11.021213 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-krwfw"] Mar 19 11:03:11 crc kubenswrapper[4765]: I0319 11:03:11.021678 4765 scope.go:117] "RemoveContainer" containerID="070eb0010e9eebd824c09f4879541ed889e4b2b6d8f07cd4da5194268db1f0f1" Mar 19 11:03:11 crc kubenswrapper[4765]: I0319 11:03:11.029789 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-krwfw"] Mar 19 11:03:11 crc kubenswrapper[4765]: I0319 11:03:11.045388 4765 scope.go:117] "RemoveContainer" containerID="2649fbb946079e0a2f5bef3e1a0c2acec8e20d29f7c2911ce787eaf9c58bf950" Mar 19 11:03:11 crc kubenswrapper[4765]: I0319 11:03:11.086713 4765 scope.go:117] "RemoveContainer" containerID="669dff6ea4e2449af5504aa6f2c048f39255494c42692d6481f6b969bd100c8a" Mar 19 11:03:11 crc kubenswrapper[4765]: E0319 11:03:11.087150 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"669dff6ea4e2449af5504aa6f2c048f39255494c42692d6481f6b969bd100c8a\": container with ID starting with 669dff6ea4e2449af5504aa6f2c048f39255494c42692d6481f6b969bd100c8a not found: ID does not exist" containerID="669dff6ea4e2449af5504aa6f2c048f39255494c42692d6481f6b969bd100c8a" Mar 19 11:03:11 crc kubenswrapper[4765]: I0319 11:03:11.087202 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"669dff6ea4e2449af5504aa6f2c048f39255494c42692d6481f6b969bd100c8a"} err="failed to get container status \"669dff6ea4e2449af5504aa6f2c048f39255494c42692d6481f6b969bd100c8a\": rpc error: code = NotFound desc = could not find container \"669dff6ea4e2449af5504aa6f2c048f39255494c42692d6481f6b969bd100c8a\": container with ID starting with 669dff6ea4e2449af5504aa6f2c048f39255494c42692d6481f6b969bd100c8a not found: ID does not exist" Mar 19 11:03:11 crc kubenswrapper[4765]: I0319 11:03:11.087237 4765 scope.go:117] "RemoveContainer" containerID="070eb0010e9eebd824c09f4879541ed889e4b2b6d8f07cd4da5194268db1f0f1" Mar 19 11:03:11 crc kubenswrapper[4765]: E0319 11:03:11.087603 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"070eb0010e9eebd824c09f4879541ed889e4b2b6d8f07cd4da5194268db1f0f1\": container with ID starting with 070eb0010e9eebd824c09f4879541ed889e4b2b6d8f07cd4da5194268db1f0f1 not found: ID does not exist" containerID="070eb0010e9eebd824c09f4879541ed889e4b2b6d8f07cd4da5194268db1f0f1" Mar 19 11:03:11 crc kubenswrapper[4765]: I0319 11:03:11.087654 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"070eb0010e9eebd824c09f4879541ed889e4b2b6d8f07cd4da5194268db1f0f1"} err="failed to get container status \"070eb0010e9eebd824c09f4879541ed889e4b2b6d8f07cd4da5194268db1f0f1\": rpc error: code = NotFound desc = could not find container \"070eb0010e9eebd824c09f4879541ed889e4b2b6d8f07cd4da5194268db1f0f1\": container with ID 
starting with 070eb0010e9eebd824c09f4879541ed889e4b2b6d8f07cd4da5194268db1f0f1 not found: ID does not exist" Mar 19 11:03:11 crc kubenswrapper[4765]: I0319 11:03:11.087689 4765 scope.go:117] "RemoveContainer" containerID="2649fbb946079e0a2f5bef3e1a0c2acec8e20d29f7c2911ce787eaf9c58bf950" Mar 19 11:03:11 crc kubenswrapper[4765]: E0319 11:03:11.088001 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2649fbb946079e0a2f5bef3e1a0c2acec8e20d29f7c2911ce787eaf9c58bf950\": container with ID starting with 2649fbb946079e0a2f5bef3e1a0c2acec8e20d29f7c2911ce787eaf9c58bf950 not found: ID does not exist" containerID="2649fbb946079e0a2f5bef3e1a0c2acec8e20d29f7c2911ce787eaf9c58bf950" Mar 19 11:03:11 crc kubenswrapper[4765]: I0319 11:03:11.088039 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2649fbb946079e0a2f5bef3e1a0c2acec8e20d29f7c2911ce787eaf9c58bf950"} err="failed to get container status \"2649fbb946079e0a2f5bef3e1a0c2acec8e20d29f7c2911ce787eaf9c58bf950\": rpc error: code = NotFound desc = could not find container \"2649fbb946079e0a2f5bef3e1a0c2acec8e20d29f7c2911ce787eaf9c58bf950\": container with ID starting with 2649fbb946079e0a2f5bef3e1a0c2acec8e20d29f7c2911ce787eaf9c58bf950 not found: ID does not exist" Mar 19 11:03:12 crc kubenswrapper[4765]: I0319 11:03:12.371656 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7d37ee5-8b10-41b9-a017-3b1b5c928f7a" path="/var/lib/kubelet/pods/d7d37ee5-8b10-41b9-a017-3b1b5c928f7a/volumes" Mar 19 11:03:21 crc kubenswrapper[4765]: I0319 11:03:21.358827 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:03:21 crc kubenswrapper[4765]: E0319 11:03:21.360059 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:03:33 crc kubenswrapper[4765]: I0319 11:03:33.356224 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:03:33 crc kubenswrapper[4765]: E0319 11:03:33.357119 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:03:45 crc kubenswrapper[4765]: I0319 11:03:45.356065 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:03:45 crc kubenswrapper[4765]: E0319 11:03:45.356927 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:03:59 crc kubenswrapper[4765]: I0319 11:03:59.360658 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:03:59 crc kubenswrapper[4765]: E0319 11:03:59.362614 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:04:00 crc kubenswrapper[4765]: I0319 11:04:00.150448 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565304-mvcqw"] Mar 19 11:04:00 crc kubenswrapper[4765]: E0319 11:04:00.151262 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d37ee5-8b10-41b9-a017-3b1b5c928f7a" containerName="extract-utilities" Mar 19 11:04:00 crc kubenswrapper[4765]: I0319 11:04:00.151290 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d37ee5-8b10-41b9-a017-3b1b5c928f7a" containerName="extract-utilities" Mar 19 11:04:00 crc kubenswrapper[4765]: E0319 11:04:00.151322 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d37ee5-8b10-41b9-a017-3b1b5c928f7a" containerName="registry-server" Mar 19 11:04:00 crc kubenswrapper[4765]: I0319 11:04:00.151333 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d37ee5-8b10-41b9-a017-3b1b5c928f7a" containerName="registry-server" Mar 19 11:04:00 crc kubenswrapper[4765]: E0319 11:04:00.151353 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7d37ee5-8b10-41b9-a017-3b1b5c928f7a" containerName="extract-content" Mar 19 11:04:00 crc kubenswrapper[4765]: I0319 11:04:00.151364 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7d37ee5-8b10-41b9-a017-3b1b5c928f7a" containerName="extract-content" Mar 19 11:04:00 crc kubenswrapper[4765]: I0319 11:04:00.151606 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7d37ee5-8b10-41b9-a017-3b1b5c928f7a" containerName="registry-server" Mar 19 11:04:00 crc kubenswrapper[4765]: I0319 11:04:00.154233 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565304-mvcqw" Mar 19 11:04:00 crc kubenswrapper[4765]: I0319 11:04:00.158797 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:04:00 crc kubenswrapper[4765]: I0319 11:04:00.158838 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:04:00 crc kubenswrapper[4765]: I0319 11:04:00.159004 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:04:00 crc kubenswrapper[4765]: I0319 11:04:00.162589 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565304-mvcqw"] Mar 19 11:04:00 crc kubenswrapper[4765]: I0319 11:04:00.296549 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpcd7\" (UniqueName: \"kubernetes.io/projected/c8c122d7-d33a-4c98-970e-be77ef4539e9-kube-api-access-fpcd7\") pod \"auto-csr-approver-29565304-mvcqw\" (UID: \"c8c122d7-d33a-4c98-970e-be77ef4539e9\") " pod="openshift-infra/auto-csr-approver-29565304-mvcqw" Mar 19 11:04:00 crc kubenswrapper[4765]: I0319 11:04:00.399100 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpcd7\" (UniqueName: \"kubernetes.io/projected/c8c122d7-d33a-4c98-970e-be77ef4539e9-kube-api-access-fpcd7\") pod \"auto-csr-approver-29565304-mvcqw\" (UID: \"c8c122d7-d33a-4c98-970e-be77ef4539e9\") " pod="openshift-infra/auto-csr-approver-29565304-mvcqw" Mar 19 11:04:00 crc kubenswrapper[4765]: I0319 11:04:00.422780 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpcd7\" (UniqueName: \"kubernetes.io/projected/c8c122d7-d33a-4c98-970e-be77ef4539e9-kube-api-access-fpcd7\") pod \"auto-csr-approver-29565304-mvcqw\" (UID: \"c8c122d7-d33a-4c98-970e-be77ef4539e9\") " 
pod="openshift-infra/auto-csr-approver-29565304-mvcqw" Mar 19 11:04:00 crc kubenswrapper[4765]: I0319 11:04:00.476016 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565304-mvcqw" Mar 19 11:04:00 crc kubenswrapper[4765]: I0319 11:04:00.967671 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565304-mvcqw"] Mar 19 11:04:01 crc kubenswrapper[4765]: I0319 11:04:01.460533 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565304-mvcqw" event={"ID":"c8c122d7-d33a-4c98-970e-be77ef4539e9","Type":"ContainerStarted","Data":"32db1ae475d0379ac26e12a14105d66d7c69e31935ab5899b51e8c17d8d3842a"} Mar 19 11:04:03 crc kubenswrapper[4765]: I0319 11:04:03.498759 4765 generic.go:334] "Generic (PLEG): container finished" podID="c8c122d7-d33a-4c98-970e-be77ef4539e9" containerID="7f8df658308dc6880e58feb2cacde21ea7c28617357ae46adae88ebbb6da430e" exitCode=0 Mar 19 11:04:03 crc kubenswrapper[4765]: I0319 11:04:03.498871 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565304-mvcqw" event={"ID":"c8c122d7-d33a-4c98-970e-be77ef4539e9","Type":"ContainerDied","Data":"7f8df658308dc6880e58feb2cacde21ea7c28617357ae46adae88ebbb6da430e"} Mar 19 11:04:04 crc kubenswrapper[4765]: I0319 11:04:04.894818 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565304-mvcqw" Mar 19 11:04:05 crc kubenswrapper[4765]: I0319 11:04:05.014493 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fpcd7\" (UniqueName: \"kubernetes.io/projected/c8c122d7-d33a-4c98-970e-be77ef4539e9-kube-api-access-fpcd7\") pod \"c8c122d7-d33a-4c98-970e-be77ef4539e9\" (UID: \"c8c122d7-d33a-4c98-970e-be77ef4539e9\") " Mar 19 11:04:05 crc kubenswrapper[4765]: I0319 11:04:05.026682 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c8c122d7-d33a-4c98-970e-be77ef4539e9-kube-api-access-fpcd7" (OuterVolumeSpecName: "kube-api-access-fpcd7") pod "c8c122d7-d33a-4c98-970e-be77ef4539e9" (UID: "c8c122d7-d33a-4c98-970e-be77ef4539e9"). InnerVolumeSpecName "kube-api-access-fpcd7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:04:05 crc kubenswrapper[4765]: I0319 11:04:05.116535 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fpcd7\" (UniqueName: \"kubernetes.io/projected/c8c122d7-d33a-4c98-970e-be77ef4539e9-kube-api-access-fpcd7\") on node \"crc\" DevicePath \"\"" Mar 19 11:04:05 crc kubenswrapper[4765]: I0319 11:04:05.527453 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565304-mvcqw" event={"ID":"c8c122d7-d33a-4c98-970e-be77ef4539e9","Type":"ContainerDied","Data":"32db1ae475d0379ac26e12a14105d66d7c69e31935ab5899b51e8c17d8d3842a"} Mar 19 11:04:05 crc kubenswrapper[4765]: I0319 11:04:05.527527 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="32db1ae475d0379ac26e12a14105d66d7c69e31935ab5899b51e8c17d8d3842a" Mar 19 11:04:05 crc kubenswrapper[4765]: I0319 11:04:05.527545 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565304-mvcqw" Mar 19 11:04:05 crc kubenswrapper[4765]: I0319 11:04:05.970659 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565298-zd8jt"] Mar 19 11:04:05 crc kubenswrapper[4765]: I0319 11:04:05.979262 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565298-zd8jt"] Mar 19 11:04:06 crc kubenswrapper[4765]: I0319 11:04:06.375568 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="555dd4fc-9bc7-406b-b094-409a3231e411" path="/var/lib/kubelet/pods/555dd4fc-9bc7-406b-b094-409a3231e411/volumes" Mar 19 11:04:11 crc kubenswrapper[4765]: I0319 11:04:11.356274 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:04:11 crc kubenswrapper[4765]: E0319 11:04:11.357313 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:04:24 crc kubenswrapper[4765]: I0319 11:04:24.733293 4765 generic.go:334] "Generic (PLEG): container finished" podID="f9cf075c-03d2-4254-9ab9-5500d4f42186" containerID="ad6a35415cfed9684eaf139936e0b0a62c54b279abfc24e8bbea14e59dde6355" exitCode=0 Mar 19 11:04:24 crc kubenswrapper[4765]: I0319 11:04:24.733362 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" event={"ID":"f9cf075c-03d2-4254-9ab9-5500d4f42186","Type":"ContainerDied","Data":"ad6a35415cfed9684eaf139936e0b0a62c54b279abfc24e8bbea14e59dde6355"} Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.306020 
4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.358011 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:04:26 crc kubenswrapper[4765]: E0319 11:04:26.358369 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.469089 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-1\") pod \"f9cf075c-03d2-4254-9ab9-5500d4f42186\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.469515 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-3\") pod \"f9cf075c-03d2-4254-9ab9-5500d4f42186\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.469571 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-combined-ca-bundle\") pod \"f9cf075c-03d2-4254-9ab9-5500d4f42186\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 
11:04:26.469635 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-inventory\") pod \"f9cf075c-03d2-4254-9ab9-5500d4f42186\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.469747 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-extra-config-0\") pod \"f9cf075c-03d2-4254-9ab9-5500d4f42186\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.469809 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-2\") pod \"f9cf075c-03d2-4254-9ab9-5500d4f42186\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.469836 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-0\") pod \"f9cf075c-03d2-4254-9ab9-5500d4f42186\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.469886 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-ssh-key-openstack-edpm-ipam\") pod \"f9cf075c-03d2-4254-9ab9-5500d4f42186\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.469927 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-1\" (UniqueName: 
\"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-migration-ssh-key-1\") pod \"f9cf075c-03d2-4254-9ab9-5500d4f42186\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.470008 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cjvgd\" (UniqueName: \"kubernetes.io/projected/f9cf075c-03d2-4254-9ab9-5500d4f42186-kube-api-access-cjvgd\") pod \"f9cf075c-03d2-4254-9ab9-5500d4f42186\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.470054 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-migration-ssh-key-0\") pod \"f9cf075c-03d2-4254-9ab9-5500d4f42186\" (UID: \"f9cf075c-03d2-4254-9ab9-5500d4f42186\") " Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.478649 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-combined-ca-bundle" (OuterVolumeSpecName: "nova-combined-ca-bundle") pod "f9cf075c-03d2-4254-9ab9-5500d4f42186" (UID: "f9cf075c-03d2-4254-9ab9-5500d4f42186"). InnerVolumeSpecName "nova-combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.488355 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f9cf075c-03d2-4254-9ab9-5500d4f42186-kube-api-access-cjvgd" (OuterVolumeSpecName: "kube-api-access-cjvgd") pod "f9cf075c-03d2-4254-9ab9-5500d4f42186" (UID: "f9cf075c-03d2-4254-9ab9-5500d4f42186"). InnerVolumeSpecName "kube-api-access-cjvgd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.504276 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-migration-ssh-key-1" (OuterVolumeSpecName: "nova-migration-ssh-key-1") pod "f9cf075c-03d2-4254-9ab9-5500d4f42186" (UID: "f9cf075c-03d2-4254-9ab9-5500d4f42186"). InnerVolumeSpecName "nova-migration-ssh-key-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.505649 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-extra-config-0" (OuterVolumeSpecName: "nova-extra-config-0") pod "f9cf075c-03d2-4254-9ab9-5500d4f42186" (UID: "f9cf075c-03d2-4254-9ab9-5500d4f42186"). InnerVolumeSpecName "nova-extra-config-0". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.515224 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-inventory" (OuterVolumeSpecName: "inventory") pod "f9cf075c-03d2-4254-9ab9-5500d4f42186" (UID: "f9cf075c-03d2-4254-9ab9-5500d4f42186"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.516887 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-2" (OuterVolumeSpecName: "nova-cell1-compute-config-2") pod "f9cf075c-03d2-4254-9ab9-5500d4f42186" (UID: "f9cf075c-03d2-4254-9ab9-5500d4f42186"). InnerVolumeSpecName "nova-cell1-compute-config-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.521213 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "f9cf075c-03d2-4254-9ab9-5500d4f42186" (UID: "f9cf075c-03d2-4254-9ab9-5500d4f42186"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.523885 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-0" (OuterVolumeSpecName: "nova-cell1-compute-config-0") pod "f9cf075c-03d2-4254-9ab9-5500d4f42186" (UID: "f9cf075c-03d2-4254-9ab9-5500d4f42186"). InnerVolumeSpecName "nova-cell1-compute-config-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.527286 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-3" (OuterVolumeSpecName: "nova-cell1-compute-config-3") pod "f9cf075c-03d2-4254-9ab9-5500d4f42186" (UID: "f9cf075c-03d2-4254-9ab9-5500d4f42186"). InnerVolumeSpecName "nova-cell1-compute-config-3". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.531188 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-1" (OuterVolumeSpecName: "nova-cell1-compute-config-1") pod "f9cf075c-03d2-4254-9ab9-5500d4f42186" (UID: "f9cf075c-03d2-4254-9ab9-5500d4f42186"). InnerVolumeSpecName "nova-cell1-compute-config-1". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.539159 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-migration-ssh-key-0" (OuterVolumeSpecName: "nova-migration-ssh-key-0") pod "f9cf075c-03d2-4254-9ab9-5500d4f42186" (UID: "f9cf075c-03d2-4254-9ab9-5500d4f42186"). InnerVolumeSpecName "nova-migration-ssh-key-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.573191 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cjvgd\" (UniqueName: \"kubernetes.io/projected/f9cf075c-03d2-4254-9ab9-5500d4f42186-kube-api-access-cjvgd\") on node \"crc\" DevicePath \"\"" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.573462 4765 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-0\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-migration-ssh-key-0\") on node \"crc\" DevicePath \"\"" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.573550 4765 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-1\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-1\") on node \"crc\" DevicePath \"\"" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.573634 4765 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-3\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-3\") on node \"crc\" DevicePath \"\"" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.573729 4765 reconciler_common.go:293] "Volume detached for volume \"nova-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 
11:04:26.573817 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.573901 4765 reconciler_common.go:293] "Volume detached for volume \"nova-extra-config-0\" (UniqueName: \"kubernetes.io/configmap/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-extra-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.574004 4765 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-0\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-0\") on node \"crc\" DevicePath \"\"" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.574083 4765 reconciler_common.go:293] "Volume detached for volume \"nova-cell1-compute-config-2\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-cell1-compute-config-2\") on node \"crc\" DevicePath \"\"" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.574161 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.574233 4765 reconciler_common.go:293] "Volume detached for volume \"nova-migration-ssh-key-1\" (UniqueName: \"kubernetes.io/secret/f9cf075c-03d2-4254-9ab9-5500d4f42186-nova-migration-ssh-key-1\") on node \"crc\" DevicePath \"\"" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.760100 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" event={"ID":"f9cf075c-03d2-4254-9ab9-5500d4f42186","Type":"ContainerDied","Data":"8d10a35e176a91c6d0b8644f2ae5509030b79a556c7e24de96d0f1c6b5664192"} Mar 19 11:04:26 crc 
kubenswrapper[4765]: I0319 11:04:26.760166 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8d10a35e176a91c6d0b8644f2ae5509030b79a556c7e24de96d0f1c6b5664192" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.760634 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/nova-edpm-deployment-openstack-edpm-ipam-skpp9" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.883623 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq"] Mar 19 11:04:26 crc kubenswrapper[4765]: E0319 11:04:26.884037 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c8c122d7-d33a-4c98-970e-be77ef4539e9" containerName="oc" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.884058 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c8c122d7-d33a-4c98-970e-be77ef4539e9" containerName="oc" Mar 19 11:04:26 crc kubenswrapper[4765]: E0319 11:04:26.884080 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f9cf075c-03d2-4254-9ab9-5500d4f42186" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.884086 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f9cf075c-03d2-4254-9ab9-5500d4f42186" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.884237 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c8c122d7-d33a-4c98-970e-be77ef4539e9" containerName="oc" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.884255 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f9cf075c-03d2-4254-9ab9-5500d4f42186" containerName="nova-edpm-deployment-openstack-edpm-ipam" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.884871 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.890904 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"ceilometer-compute-config-data" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.891421 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplane-ansible-ssh-private-key-secret" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.891820 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"openstack-edpm-ipam-dockercfg-kbh2c" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.891926 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dataplanenodeset-openstack-edpm-ipam" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.891941 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openstack-aee-default-env" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.900448 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq"] Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.980686 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.980881 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ssh-key-openstack-edpm-ipam\") pod 
\"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.981078 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.981162 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6tlx\" (UniqueName: \"kubernetes.io/projected/611d61c3-8dd1-46e4-a579-ded4e91917ed-kube-api-access-p6tlx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.981316 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.981472 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:26 crc kubenswrapper[4765]: I0319 11:04:26.981519 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.083058 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.083344 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p6tlx\" (UniqueName: \"kubernetes.io/projected/611d61c3-8dd1-46e4-a579-ded4e91917ed-kube-api-access-p6tlx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.083484 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.083602 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.083712 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.083846 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.083994 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.087867 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-1\" (UniqueName: 
\"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-1\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.087930 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-inventory\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.088369 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-telemetry-combined-ca-bundle\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.088769 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-0\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.101371 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-2\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " 
pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.101816 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ssh-key-openstack-edpm-ipam\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.105419 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p6tlx\" (UniqueName: \"kubernetes.io/projected/611d61c3-8dd1-46e4-a579-ded4e91917ed-kube-api-access-p6tlx\") pod \"telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.205934 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.739163 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq"] Mar 19 11:04:27 crc kubenswrapper[4765]: I0319 11:04:27.769373 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" event={"ID":"611d61c3-8dd1-46e4-a579-ded4e91917ed","Type":"ContainerStarted","Data":"bdf2cba5e4701c58dbf68c283085c8f1af79cc37ded697142d26b9d7486d6590"} Mar 19 11:04:28 crc kubenswrapper[4765]: I0319 11:04:28.782400 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" event={"ID":"611d61c3-8dd1-46e4-a579-ded4e91917ed","Type":"ContainerStarted","Data":"0ff3b0108ec3fb5b9705b9f71a82dd68dab7d98f155b8624fd7ad7b346e55603"} Mar 19 11:04:28 crc kubenswrapper[4765]: I0319 11:04:28.812470 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" podStartSLOduration=2.203498251 podStartE2EDuration="2.812452742s" podCreationTimestamp="2026-03-19 11:04:26 +0000 UTC" firstStartedPulling="2026-03-19 11:04:27.746499724 +0000 UTC m=+2566.095445266" lastFinishedPulling="2026-03-19 11:04:28.355454195 +0000 UTC m=+2566.704399757" observedRunningTime="2026-03-19 11:04:28.804758946 +0000 UTC m=+2567.153704488" watchObservedRunningTime="2026-03-19 11:04:28.812452742 +0000 UTC m=+2567.161398284" Mar 19 11:04:37 crc kubenswrapper[4765]: I0319 11:04:37.357247 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:04:37 crc kubenswrapper[4765]: E0319 11:04:37.358339 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:04:52 crc kubenswrapper[4765]: I0319 11:04:52.395516 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:04:52 crc kubenswrapper[4765]: E0319 11:04:52.396775 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:05:00 crc kubenswrapper[4765]: I0319 11:05:00.996237 4765 scope.go:117] "RemoveContainer" containerID="13252fa8860b4c68428c4a9bb1f57507f81f71690225a633cb5da068dc3a1148" Mar 19 11:05:05 crc kubenswrapper[4765]: I0319 11:05:05.356055 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:05:05 crc kubenswrapper[4765]: E0319 11:05:05.356681 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:05:17 crc kubenswrapper[4765]: I0319 11:05:17.355719 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 
11:05:17 crc kubenswrapper[4765]: E0319 11:05:17.356529 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:05:28 crc kubenswrapper[4765]: I0319 11:05:28.356727 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:05:28 crc kubenswrapper[4765]: E0319 11:05:28.357446 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:05:43 crc kubenswrapper[4765]: I0319 11:05:43.357419 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:05:43 crc kubenswrapper[4765]: E0319 11:05:43.358805 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:05:57 crc kubenswrapper[4765]: I0319 11:05:57.356261 4765 scope.go:117] "RemoveContainer" 
containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:05:57 crc kubenswrapper[4765]: E0319 11:05:57.357091 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:06:00 crc kubenswrapper[4765]: I0319 11:06:00.153893 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565306-gplrg"] Mar 19 11:06:00 crc kubenswrapper[4765]: I0319 11:06:00.156118 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565306-gplrg" Mar 19 11:06:00 crc kubenswrapper[4765]: I0319 11:06:00.158385 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:06:00 crc kubenswrapper[4765]: I0319 11:06:00.158698 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:06:00 crc kubenswrapper[4765]: I0319 11:06:00.160506 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:06:00 crc kubenswrapper[4765]: I0319 11:06:00.162156 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565306-gplrg"] Mar 19 11:06:00 crc kubenswrapper[4765]: I0319 11:06:00.340899 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sv7jf\" (UniqueName: \"kubernetes.io/projected/3034496b-4e9f-4e55-a348-691b747da728-kube-api-access-sv7jf\") pod \"auto-csr-approver-29565306-gplrg\" (UID: 
\"3034496b-4e9f-4e55-a348-691b747da728\") " pod="openshift-infra/auto-csr-approver-29565306-gplrg" Mar 19 11:06:00 crc kubenswrapper[4765]: I0319 11:06:00.443773 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sv7jf\" (UniqueName: \"kubernetes.io/projected/3034496b-4e9f-4e55-a348-691b747da728-kube-api-access-sv7jf\") pod \"auto-csr-approver-29565306-gplrg\" (UID: \"3034496b-4e9f-4e55-a348-691b747da728\") " pod="openshift-infra/auto-csr-approver-29565306-gplrg" Mar 19 11:06:00 crc kubenswrapper[4765]: I0319 11:06:00.468049 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sv7jf\" (UniqueName: \"kubernetes.io/projected/3034496b-4e9f-4e55-a348-691b747da728-kube-api-access-sv7jf\") pod \"auto-csr-approver-29565306-gplrg\" (UID: \"3034496b-4e9f-4e55-a348-691b747da728\") " pod="openshift-infra/auto-csr-approver-29565306-gplrg" Mar 19 11:06:00 crc kubenswrapper[4765]: I0319 11:06:00.481060 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565306-gplrg" Mar 19 11:06:00 crc kubenswrapper[4765]: W0319 11:06:00.966368 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3034496b_4e9f_4e55_a348_691b747da728.slice/crio-90b63b88a84881e0dd01eac02123d58a15062675bc2da3ef24212567a16f5ce2 WatchSource:0}: Error finding container 90b63b88a84881e0dd01eac02123d58a15062675bc2da3ef24212567a16f5ce2: Status 404 returned error can't find the container with id 90b63b88a84881e0dd01eac02123d58a15062675bc2da3ef24212567a16f5ce2 Mar 19 11:06:00 crc kubenswrapper[4765]: I0319 11:06:00.974978 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565306-gplrg"] Mar 19 11:06:01 crc kubenswrapper[4765]: I0319 11:06:01.750552 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565306-gplrg" event={"ID":"3034496b-4e9f-4e55-a348-691b747da728","Type":"ContainerStarted","Data":"90b63b88a84881e0dd01eac02123d58a15062675bc2da3ef24212567a16f5ce2"} Mar 19 11:06:03 crc kubenswrapper[4765]: I0319 11:06:03.790684 4765 generic.go:334] "Generic (PLEG): container finished" podID="3034496b-4e9f-4e55-a348-691b747da728" containerID="37797939112b7f4b03a51c87fc26899c1a25e98e160adea32ae3b7d6f3c3e4bd" exitCode=0 Mar 19 11:06:03 crc kubenswrapper[4765]: I0319 11:06:03.791362 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565306-gplrg" event={"ID":"3034496b-4e9f-4e55-a348-691b747da728","Type":"ContainerDied","Data":"37797939112b7f4b03a51c87fc26899c1a25e98e160adea32ae3b7d6f3c3e4bd"} Mar 19 11:06:05 crc kubenswrapper[4765]: I0319 11:06:05.190494 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565306-gplrg" Mar 19 11:06:05 crc kubenswrapper[4765]: I0319 11:06:05.338025 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sv7jf\" (UniqueName: \"kubernetes.io/projected/3034496b-4e9f-4e55-a348-691b747da728-kube-api-access-sv7jf\") pod \"3034496b-4e9f-4e55-a348-691b747da728\" (UID: \"3034496b-4e9f-4e55-a348-691b747da728\") " Mar 19 11:06:05 crc kubenswrapper[4765]: I0319 11:06:05.345234 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3034496b-4e9f-4e55-a348-691b747da728-kube-api-access-sv7jf" (OuterVolumeSpecName: "kube-api-access-sv7jf") pod "3034496b-4e9f-4e55-a348-691b747da728" (UID: "3034496b-4e9f-4e55-a348-691b747da728"). InnerVolumeSpecName "kube-api-access-sv7jf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:06:05 crc kubenswrapper[4765]: I0319 11:06:05.440085 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sv7jf\" (UniqueName: \"kubernetes.io/projected/3034496b-4e9f-4e55-a348-691b747da728-kube-api-access-sv7jf\") on node \"crc\" DevicePath \"\"" Mar 19 11:06:05 crc kubenswrapper[4765]: I0319 11:06:05.817380 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565306-gplrg" event={"ID":"3034496b-4e9f-4e55-a348-691b747da728","Type":"ContainerDied","Data":"90b63b88a84881e0dd01eac02123d58a15062675bc2da3ef24212567a16f5ce2"} Mar 19 11:06:05 crc kubenswrapper[4765]: I0319 11:06:05.817437 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90b63b88a84881e0dd01eac02123d58a15062675bc2da3ef24212567a16f5ce2" Mar 19 11:06:05 crc kubenswrapper[4765]: I0319 11:06:05.817448 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565306-gplrg" Mar 19 11:06:06 crc kubenswrapper[4765]: I0319 11:06:06.267344 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565300-9bfw7"] Mar 19 11:06:06 crc kubenswrapper[4765]: I0319 11:06:06.275673 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565300-9bfw7"] Mar 19 11:06:06 crc kubenswrapper[4765]: I0319 11:06:06.366184 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0f824049-f8e9-4909-836d-2bb5ac32f722" path="/var/lib/kubelet/pods/0f824049-f8e9-4909-836d-2bb5ac32f722/volumes" Mar 19 11:06:11 crc kubenswrapper[4765]: I0319 11:06:11.356242 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:06:11 crc kubenswrapper[4765]: E0319 11:06:11.357240 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:06:20 crc kubenswrapper[4765]: I0319 11:06:20.864293 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-6rrpf"] Mar 19 11:06:20 crc kubenswrapper[4765]: E0319 11:06:20.865072 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3034496b-4e9f-4e55-a348-691b747da728" containerName="oc" Mar 19 11:06:20 crc kubenswrapper[4765]: I0319 11:06:20.865088 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3034496b-4e9f-4e55-a348-691b747da728" containerName="oc" Mar 19 11:06:20 crc kubenswrapper[4765]: I0319 11:06:20.865338 4765 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="3034496b-4e9f-4e55-a348-691b747da728" containerName="oc" Mar 19 11:06:20 crc kubenswrapper[4765]: I0319 11:06:20.866649 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6rrpf" Mar 19 11:06:20 crc kubenswrapper[4765]: I0319 11:06:20.892854 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6rrpf"] Mar 19 11:06:20 crc kubenswrapper[4765]: I0319 11:06:20.985343 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c5d371b-d02f-497c-af99-4c138232f8c0-catalog-content\") pod \"community-operators-6rrpf\" (UID: \"1c5d371b-d02f-497c-af99-4c138232f8c0\") " pod="openshift-marketplace/community-operators-6rrpf" Mar 19 11:06:20 crc kubenswrapper[4765]: I0319 11:06:20.985755 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twxzm\" (UniqueName: \"kubernetes.io/projected/1c5d371b-d02f-497c-af99-4c138232f8c0-kube-api-access-twxzm\") pod \"community-operators-6rrpf\" (UID: \"1c5d371b-d02f-497c-af99-4c138232f8c0\") " pod="openshift-marketplace/community-operators-6rrpf" Mar 19 11:06:20 crc kubenswrapper[4765]: I0319 11:06:20.985943 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c5d371b-d02f-497c-af99-4c138232f8c0-utilities\") pod \"community-operators-6rrpf\" (UID: \"1c5d371b-d02f-497c-af99-4c138232f8c0\") " pod="openshift-marketplace/community-operators-6rrpf" Mar 19 11:06:21 crc kubenswrapper[4765]: I0319 11:06:21.087797 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c5d371b-d02f-497c-af99-4c138232f8c0-catalog-content\") pod \"community-operators-6rrpf\" (UID: 
\"1c5d371b-d02f-497c-af99-4c138232f8c0\") " pod="openshift-marketplace/community-operators-6rrpf" Mar 19 11:06:21 crc kubenswrapper[4765]: I0319 11:06:21.087920 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twxzm\" (UniqueName: \"kubernetes.io/projected/1c5d371b-d02f-497c-af99-4c138232f8c0-kube-api-access-twxzm\") pod \"community-operators-6rrpf\" (UID: \"1c5d371b-d02f-497c-af99-4c138232f8c0\") " pod="openshift-marketplace/community-operators-6rrpf" Mar 19 11:06:21 crc kubenswrapper[4765]: I0319 11:06:21.087979 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c5d371b-d02f-497c-af99-4c138232f8c0-utilities\") pod \"community-operators-6rrpf\" (UID: \"1c5d371b-d02f-497c-af99-4c138232f8c0\") " pod="openshift-marketplace/community-operators-6rrpf" Mar 19 11:06:21 crc kubenswrapper[4765]: I0319 11:06:21.088498 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1c5d371b-d02f-497c-af99-4c138232f8c0-catalog-content\") pod \"community-operators-6rrpf\" (UID: \"1c5d371b-d02f-497c-af99-4c138232f8c0\") " pod="openshift-marketplace/community-operators-6rrpf" Mar 19 11:06:21 crc kubenswrapper[4765]: I0319 11:06:21.088507 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1c5d371b-d02f-497c-af99-4c138232f8c0-utilities\") pod \"community-operators-6rrpf\" (UID: \"1c5d371b-d02f-497c-af99-4c138232f8c0\") " pod="openshift-marketplace/community-operators-6rrpf" Mar 19 11:06:21 crc kubenswrapper[4765]: I0319 11:06:21.106815 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twxzm\" (UniqueName: \"kubernetes.io/projected/1c5d371b-d02f-497c-af99-4c138232f8c0-kube-api-access-twxzm\") pod \"community-operators-6rrpf\" (UID: 
\"1c5d371b-d02f-497c-af99-4c138232f8c0\") " pod="openshift-marketplace/community-operators-6rrpf" Mar 19 11:06:21 crc kubenswrapper[4765]: I0319 11:06:21.187367 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-6rrpf" Mar 19 11:06:21 crc kubenswrapper[4765]: I0319 11:06:21.569198 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6rrpf"] Mar 19 11:06:21 crc kubenswrapper[4765]: I0319 11:06:21.972732 4765 generic.go:334] "Generic (PLEG): container finished" podID="1c5d371b-d02f-497c-af99-4c138232f8c0" containerID="5fc8eaaeaeac2f5fb553b686235a05b995f3f80c4563e5144afd68c5e54ea55f" exitCode=0 Mar 19 11:06:21 crc kubenswrapper[4765]: I0319 11:06:21.972791 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rrpf" event={"ID":"1c5d371b-d02f-497c-af99-4c138232f8c0","Type":"ContainerDied","Data":"5fc8eaaeaeac2f5fb553b686235a05b995f3f80c4563e5144afd68c5e54ea55f"} Mar 19 11:06:21 crc kubenswrapper[4765]: I0319 11:06:21.973065 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rrpf" event={"ID":"1c5d371b-d02f-497c-af99-4c138232f8c0","Type":"ContainerStarted","Data":"d0f83049245ddf4f84be3c1e7ca7f8ec7b54b604f8e00d26d9c978df7418c927"} Mar 19 11:06:24 crc kubenswrapper[4765]: I0319 11:06:24.356588 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:06:24 crc kubenswrapper[4765]: E0319 11:06:24.357583 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" 
podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:06:27 crc kubenswrapper[4765]: I0319 11:06:27.031058 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rrpf" event={"ID":"1c5d371b-d02f-497c-af99-4c138232f8c0","Type":"ContainerStarted","Data":"edb94ebc2c4400275171b26d92593ba03400ae3b630bd9e633370711cbcc0cb2"} Mar 19 11:06:28 crc kubenswrapper[4765]: I0319 11:06:28.044884 4765 generic.go:334] "Generic (PLEG): container finished" podID="1c5d371b-d02f-497c-af99-4c138232f8c0" containerID="edb94ebc2c4400275171b26d92593ba03400ae3b630bd9e633370711cbcc0cb2" exitCode=0 Mar 19 11:06:28 crc kubenswrapper[4765]: I0319 11:06:28.045053 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rrpf" event={"ID":"1c5d371b-d02f-497c-af99-4c138232f8c0","Type":"ContainerDied","Data":"edb94ebc2c4400275171b26d92593ba03400ae3b630bd9e633370711cbcc0cb2"} Mar 19 11:06:29 crc kubenswrapper[4765]: I0319 11:06:29.056081 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-6rrpf" event={"ID":"1c5d371b-d02f-497c-af99-4c138232f8c0","Type":"ContainerStarted","Data":"ea0fe767238cd77c342f0e1fed387a7d3644d362d13539f518c85beebb311216"} Mar 19 11:06:29 crc kubenswrapper[4765]: I0319 11:06:29.084284 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-6rrpf" podStartSLOduration=2.409766012 podStartE2EDuration="9.084260724s" podCreationTimestamp="2026-03-19 11:06:20 +0000 UTC" firstStartedPulling="2026-03-19 11:06:21.974763448 +0000 UTC m=+2680.323708980" lastFinishedPulling="2026-03-19 11:06:28.64925815 +0000 UTC m=+2686.998203692" observedRunningTime="2026-03-19 11:06:29.077939053 +0000 UTC m=+2687.426884595" watchObservedRunningTime="2026-03-19 11:06:29.084260724 +0000 UTC m=+2687.433206266" Mar 19 11:06:31 crc kubenswrapper[4765]: I0319 11:06:31.188236 4765 kubelet.go:2542] "SyncLoop 
(probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-6rrpf" Mar 19 11:06:31 crc kubenswrapper[4765]: I0319 11:06:31.188621 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-6rrpf" Mar 19 11:06:31 crc kubenswrapper[4765]: I0319 11:06:31.237445 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-6rrpf" Mar 19 11:06:38 crc kubenswrapper[4765]: I0319 11:06:38.356558 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:06:38 crc kubenswrapper[4765]: E0319 11:06:38.357476 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:06:41 crc kubenswrapper[4765]: I0319 11:06:41.238537 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-6rrpf" Mar 19 11:06:41 crc kubenswrapper[4765]: I0319 11:06:41.309463 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-6rrpf"] Mar 19 11:06:41 crc kubenswrapper[4765]: I0319 11:06:41.357875 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kqvrg"] Mar 19 11:06:41 crc kubenswrapper[4765]: I0319 11:06:41.358196 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-kqvrg" podUID="306b7883-2d24-4ff3-9154-8a45be8447fb" containerName="registry-server" 
containerID="cri-o://640bdf4c69d151e9731e1dcbbf410a6e8dd702920781661edafa1469372a76ba" gracePeriod=2 Mar 19 11:06:41 crc kubenswrapper[4765]: I0319 11:06:41.860085 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kqvrg" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.033818 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306b7883-2d24-4ff3-9154-8a45be8447fb-catalog-content\") pod \"306b7883-2d24-4ff3-9154-8a45be8447fb\" (UID: \"306b7883-2d24-4ff3-9154-8a45be8447fb\") " Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.033893 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-29n7v\" (UniqueName: \"kubernetes.io/projected/306b7883-2d24-4ff3-9154-8a45be8447fb-kube-api-access-29n7v\") pod \"306b7883-2d24-4ff3-9154-8a45be8447fb\" (UID: \"306b7883-2d24-4ff3-9154-8a45be8447fb\") " Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.034068 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306b7883-2d24-4ff3-9154-8a45be8447fb-utilities\") pod \"306b7883-2d24-4ff3-9154-8a45be8447fb\" (UID: \"306b7883-2d24-4ff3-9154-8a45be8447fb\") " Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.035331 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306b7883-2d24-4ff3-9154-8a45be8447fb-utilities" (OuterVolumeSpecName: "utilities") pod "306b7883-2d24-4ff3-9154-8a45be8447fb" (UID: "306b7883-2d24-4ff3-9154-8a45be8447fb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.042265 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/306b7883-2d24-4ff3-9154-8a45be8447fb-kube-api-access-29n7v" (OuterVolumeSpecName: "kube-api-access-29n7v") pod "306b7883-2d24-4ff3-9154-8a45be8447fb" (UID: "306b7883-2d24-4ff3-9154-8a45be8447fb"). InnerVolumeSpecName "kube-api-access-29n7v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.097017 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/306b7883-2d24-4ff3-9154-8a45be8447fb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "306b7883-2d24-4ff3-9154-8a45be8447fb" (UID: "306b7883-2d24-4ff3-9154-8a45be8447fb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.136735 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/306b7883-2d24-4ff3-9154-8a45be8447fb-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.136786 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-29n7v\" (UniqueName: \"kubernetes.io/projected/306b7883-2d24-4ff3-9154-8a45be8447fb-kube-api-access-29n7v\") on node \"crc\" DevicePath \"\"" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.136802 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/306b7883-2d24-4ff3-9154-8a45be8447fb-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.189453 4765 generic.go:334] "Generic (PLEG): container finished" podID="306b7883-2d24-4ff3-9154-8a45be8447fb" 
containerID="640bdf4c69d151e9731e1dcbbf410a6e8dd702920781661edafa1469372a76ba" exitCode=0 Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.189516 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-kqvrg" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.189529 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqvrg" event={"ID":"306b7883-2d24-4ff3-9154-8a45be8447fb","Type":"ContainerDied","Data":"640bdf4c69d151e9731e1dcbbf410a6e8dd702920781661edafa1469372a76ba"} Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.189570 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-kqvrg" event={"ID":"306b7883-2d24-4ff3-9154-8a45be8447fb","Type":"ContainerDied","Data":"0b3ca35060e2d5959765e1a6d0a52873c31dc1f10de60f849dd1e565e492586b"} Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.189590 4765 scope.go:117] "RemoveContainer" containerID="640bdf4c69d151e9731e1dcbbf410a6e8dd702920781661edafa1469372a76ba" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.225369 4765 scope.go:117] "RemoveContainer" containerID="ace9d7f04242335e34d1456dd684b3c2f116a903cb46ba93307161b055e4fba4" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.235481 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-kqvrg"] Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.244615 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-kqvrg"] Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.251323 4765 scope.go:117] "RemoveContainer" containerID="f896e306dc469f84129a45d98162abd1a35a12ce71d7b055a33b0ad413a4fbd2" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.300106 4765 scope.go:117] "RemoveContainer" containerID="640bdf4c69d151e9731e1dcbbf410a6e8dd702920781661edafa1469372a76ba" Mar 19 
11:06:42 crc kubenswrapper[4765]: E0319 11:06:42.300636 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"640bdf4c69d151e9731e1dcbbf410a6e8dd702920781661edafa1469372a76ba\": container with ID starting with 640bdf4c69d151e9731e1dcbbf410a6e8dd702920781661edafa1469372a76ba not found: ID does not exist" containerID="640bdf4c69d151e9731e1dcbbf410a6e8dd702920781661edafa1469372a76ba" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.300685 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"640bdf4c69d151e9731e1dcbbf410a6e8dd702920781661edafa1469372a76ba"} err="failed to get container status \"640bdf4c69d151e9731e1dcbbf410a6e8dd702920781661edafa1469372a76ba\": rpc error: code = NotFound desc = could not find container \"640bdf4c69d151e9731e1dcbbf410a6e8dd702920781661edafa1469372a76ba\": container with ID starting with 640bdf4c69d151e9731e1dcbbf410a6e8dd702920781661edafa1469372a76ba not found: ID does not exist" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.300719 4765 scope.go:117] "RemoveContainer" containerID="ace9d7f04242335e34d1456dd684b3c2f116a903cb46ba93307161b055e4fba4" Mar 19 11:06:42 crc kubenswrapper[4765]: E0319 11:06:42.301444 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ace9d7f04242335e34d1456dd684b3c2f116a903cb46ba93307161b055e4fba4\": container with ID starting with ace9d7f04242335e34d1456dd684b3c2f116a903cb46ba93307161b055e4fba4 not found: ID does not exist" containerID="ace9d7f04242335e34d1456dd684b3c2f116a903cb46ba93307161b055e4fba4" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.301482 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ace9d7f04242335e34d1456dd684b3c2f116a903cb46ba93307161b055e4fba4"} err="failed to get container status 
\"ace9d7f04242335e34d1456dd684b3c2f116a903cb46ba93307161b055e4fba4\": rpc error: code = NotFound desc = could not find container \"ace9d7f04242335e34d1456dd684b3c2f116a903cb46ba93307161b055e4fba4\": container with ID starting with ace9d7f04242335e34d1456dd684b3c2f116a903cb46ba93307161b055e4fba4 not found: ID does not exist" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.301500 4765 scope.go:117] "RemoveContainer" containerID="f896e306dc469f84129a45d98162abd1a35a12ce71d7b055a33b0ad413a4fbd2" Mar 19 11:06:42 crc kubenswrapper[4765]: E0319 11:06:42.301790 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f896e306dc469f84129a45d98162abd1a35a12ce71d7b055a33b0ad413a4fbd2\": container with ID starting with f896e306dc469f84129a45d98162abd1a35a12ce71d7b055a33b0ad413a4fbd2 not found: ID does not exist" containerID="f896e306dc469f84129a45d98162abd1a35a12ce71d7b055a33b0ad413a4fbd2" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.301829 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f896e306dc469f84129a45d98162abd1a35a12ce71d7b055a33b0ad413a4fbd2"} err="failed to get container status \"f896e306dc469f84129a45d98162abd1a35a12ce71d7b055a33b0ad413a4fbd2\": rpc error: code = NotFound desc = could not find container \"f896e306dc469f84129a45d98162abd1a35a12ce71d7b055a33b0ad413a4fbd2\": container with ID starting with f896e306dc469f84129a45d98162abd1a35a12ce71d7b055a33b0ad413a4fbd2 not found: ID does not exist" Mar 19 11:06:42 crc kubenswrapper[4765]: I0319 11:06:42.424221 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="306b7883-2d24-4ff3-9154-8a45be8447fb" path="/var/lib/kubelet/pods/306b7883-2d24-4ff3-9154-8a45be8447fb/volumes" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.355986 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-d2lck"] Mar 19 11:06:47 
crc kubenswrapper[4765]: E0319 11:06:47.357072 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306b7883-2d24-4ff3-9154-8a45be8447fb" containerName="registry-server" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.357090 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="306b7883-2d24-4ff3-9154-8a45be8447fb" containerName="registry-server" Mar 19 11:06:47 crc kubenswrapper[4765]: E0319 11:06:47.357130 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306b7883-2d24-4ff3-9154-8a45be8447fb" containerName="extract-content" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.357138 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="306b7883-2d24-4ff3-9154-8a45be8447fb" containerName="extract-content" Mar 19 11:06:47 crc kubenswrapper[4765]: E0319 11:06:47.357151 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="306b7883-2d24-4ff3-9154-8a45be8447fb" containerName="extract-utilities" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.357160 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="306b7883-2d24-4ff3-9154-8a45be8447fb" containerName="extract-utilities" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.357396 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="306b7883-2d24-4ff3-9154-8a45be8447fb" containerName="registry-server" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.359030 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.374820 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d2lck"] Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.541192 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7kt2b\" (UniqueName: \"kubernetes.io/projected/b0a58266-7d6b-4b46-8723-821c8b4fde2c-kube-api-access-7kt2b\") pod \"certified-operators-d2lck\" (UID: \"b0a58266-7d6b-4b46-8723-821c8b4fde2c\") " pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.541366 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a58266-7d6b-4b46-8723-821c8b4fde2c-utilities\") pod \"certified-operators-d2lck\" (UID: \"b0a58266-7d6b-4b46-8723-821c8b4fde2c\") " pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.541432 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a58266-7d6b-4b46-8723-821c8b4fde2c-catalog-content\") pod \"certified-operators-d2lck\" (UID: \"b0a58266-7d6b-4b46-8723-821c8b4fde2c\") " pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.643209 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7kt2b\" (UniqueName: \"kubernetes.io/projected/b0a58266-7d6b-4b46-8723-821c8b4fde2c-kube-api-access-7kt2b\") pod \"certified-operators-d2lck\" (UID: \"b0a58266-7d6b-4b46-8723-821c8b4fde2c\") " pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.643347 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a58266-7d6b-4b46-8723-821c8b4fde2c-utilities\") pod \"certified-operators-d2lck\" (UID: \"b0a58266-7d6b-4b46-8723-821c8b4fde2c\") " pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.643402 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a58266-7d6b-4b46-8723-821c8b4fde2c-catalog-content\") pod \"certified-operators-d2lck\" (UID: \"b0a58266-7d6b-4b46-8723-821c8b4fde2c\") " pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.643883 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a58266-7d6b-4b46-8723-821c8b4fde2c-utilities\") pod \"certified-operators-d2lck\" (UID: \"b0a58266-7d6b-4b46-8723-821c8b4fde2c\") " pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.643937 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a58266-7d6b-4b46-8723-821c8b4fde2c-catalog-content\") pod \"certified-operators-d2lck\" (UID: \"b0a58266-7d6b-4b46-8723-821c8b4fde2c\") " pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.680065 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7kt2b\" (UniqueName: \"kubernetes.io/projected/b0a58266-7d6b-4b46-8723-821c8b4fde2c-kube-api-access-7kt2b\") pod \"certified-operators-d2lck\" (UID: \"b0a58266-7d6b-4b46-8723-821c8b4fde2c\") " pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:06:47 crc kubenswrapper[4765]: I0319 11:06:47.978170 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:06:48 crc kubenswrapper[4765]: I0319 11:06:48.453352 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-d2lck"] Mar 19 11:06:49 crc kubenswrapper[4765]: I0319 11:06:49.254721 4765 generic.go:334] "Generic (PLEG): container finished" podID="b0a58266-7d6b-4b46-8723-821c8b4fde2c" containerID="1ff227a6e9ddc71decae4cb23b5d89b1970adf855d9577923d0eabb996f69094" exitCode=0 Mar 19 11:06:49 crc kubenswrapper[4765]: I0319 11:06:49.254927 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2lck" event={"ID":"b0a58266-7d6b-4b46-8723-821c8b4fde2c","Type":"ContainerDied","Data":"1ff227a6e9ddc71decae4cb23b5d89b1970adf855d9577923d0eabb996f69094"} Mar 19 11:06:49 crc kubenswrapper[4765]: I0319 11:06:49.255076 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2lck" event={"ID":"b0a58266-7d6b-4b46-8723-821c8b4fde2c","Type":"ContainerStarted","Data":"943dc590e0a21192e20fb5a24afc236e6252e756a264e11bff46d20bd3c7cd8e"} Mar 19 11:06:51 crc kubenswrapper[4765]: I0319 11:06:51.275356 4765 generic.go:334] "Generic (PLEG): container finished" podID="b0a58266-7d6b-4b46-8723-821c8b4fde2c" containerID="c5de56e285bcb0e04d7bab04172ca89c965245c388d252f3261e529fc068cae8" exitCode=0 Mar 19 11:06:51 crc kubenswrapper[4765]: I0319 11:06:51.275434 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2lck" event={"ID":"b0a58266-7d6b-4b46-8723-821c8b4fde2c","Type":"ContainerDied","Data":"c5de56e285bcb0e04d7bab04172ca89c965245c388d252f3261e529fc068cae8"} Mar 19 11:06:52 crc kubenswrapper[4765]: I0319 11:06:52.362757 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:06:52 crc kubenswrapper[4765]: E0319 11:06:52.363382 4765 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:06:53 crc kubenswrapper[4765]: I0319 11:06:53.298940 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2lck" event={"ID":"b0a58266-7d6b-4b46-8723-821c8b4fde2c","Type":"ContainerStarted","Data":"5093b410b3a389a154796a8d9be15c7790ead2dc7d2ba755b8fdea34dd994cda"} Mar 19 11:06:53 crc kubenswrapper[4765]: I0319 11:06:53.301652 4765 generic.go:334] "Generic (PLEG): container finished" podID="611d61c3-8dd1-46e4-a579-ded4e91917ed" containerID="0ff3b0108ec3fb5b9705b9f71a82dd68dab7d98f155b8624fd7ad7b346e55603" exitCode=0 Mar 19 11:06:53 crc kubenswrapper[4765]: I0319 11:06:53.301813 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" event={"ID":"611d61c3-8dd1-46e4-a579-ded4e91917ed","Type":"ContainerDied","Data":"0ff3b0108ec3fb5b9705b9f71a82dd68dab7d98f155b8624fd7ad7b346e55603"} Mar 19 11:06:53 crc kubenswrapper[4765]: I0319 11:06:53.322233 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-d2lck" podStartSLOduration=3.155723362 podStartE2EDuration="6.32221517s" podCreationTimestamp="2026-03-19 11:06:47 +0000 UTC" firstStartedPulling="2026-03-19 11:06:49.256622685 +0000 UTC m=+2707.605568227" lastFinishedPulling="2026-03-19 11:06:52.423114493 +0000 UTC m=+2710.772060035" observedRunningTime="2026-03-19 11:06:53.317542593 +0000 UTC m=+2711.666488145" watchObservedRunningTime="2026-03-19 11:06:53.32221517 +0000 UTC m=+2711.671160722" Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 
11:06:54.731926 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.894833 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-1\") pod \"611d61c3-8dd1-46e4-a579-ded4e91917ed\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.895185 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-inventory\") pod \"611d61c3-8dd1-46e4-a579-ded4e91917ed\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.895242 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-0\") pod \"611d61c3-8dd1-46e4-a579-ded4e91917ed\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.895309 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ssh-key-openstack-edpm-ipam\") pod \"611d61c3-8dd1-46e4-a579-ded4e91917ed\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.895358 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p6tlx\" (UniqueName: \"kubernetes.io/projected/611d61c3-8dd1-46e4-a579-ded4e91917ed-kube-api-access-p6tlx\") pod \"611d61c3-8dd1-46e4-a579-ded4e91917ed\" (UID: 
\"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.895422 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-2\") pod \"611d61c3-8dd1-46e4-a579-ded4e91917ed\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.895488 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-telemetry-combined-ca-bundle\") pod \"611d61c3-8dd1-46e4-a579-ded4e91917ed\" (UID: \"611d61c3-8dd1-46e4-a579-ded4e91917ed\") " Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.902615 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/611d61c3-8dd1-46e4-a579-ded4e91917ed-kube-api-access-p6tlx" (OuterVolumeSpecName: "kube-api-access-p6tlx") pod "611d61c3-8dd1-46e4-a579-ded4e91917ed" (UID: "611d61c3-8dd1-46e4-a579-ded4e91917ed"). InnerVolumeSpecName "kube-api-access-p6tlx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.902780 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-telemetry-combined-ca-bundle" (OuterVolumeSpecName: "telemetry-combined-ca-bundle") pod "611d61c3-8dd1-46e4-a579-ded4e91917ed" (UID: "611d61c3-8dd1-46e4-a579-ded4e91917ed"). InnerVolumeSpecName "telemetry-combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.929558 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-1" (OuterVolumeSpecName: "ceilometer-compute-config-data-1") pod "611d61c3-8dd1-46e4-a579-ded4e91917ed" (UID: "611d61c3-8dd1-46e4-a579-ded4e91917ed"). InnerVolumeSpecName "ceilometer-compute-config-data-1". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.932743 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-inventory" (OuterVolumeSpecName: "inventory") pod "611d61c3-8dd1-46e4-a579-ded4e91917ed" (UID: "611d61c3-8dd1-46e4-a579-ded4e91917ed"). InnerVolumeSpecName "inventory". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.933936 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ssh-key-openstack-edpm-ipam" (OuterVolumeSpecName: "ssh-key-openstack-edpm-ipam") pod "611d61c3-8dd1-46e4-a579-ded4e91917ed" (UID: "611d61c3-8dd1-46e4-a579-ded4e91917ed"). InnerVolumeSpecName "ssh-key-openstack-edpm-ipam". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.935010 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-2" (OuterVolumeSpecName: "ceilometer-compute-config-data-2") pod "611d61c3-8dd1-46e4-a579-ded4e91917ed" (UID: "611d61c3-8dd1-46e4-a579-ded4e91917ed"). InnerVolumeSpecName "ceilometer-compute-config-data-2". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.980944 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-0" (OuterVolumeSpecName: "ceilometer-compute-config-data-0") pod "611d61c3-8dd1-46e4-a579-ded4e91917ed" (UID: "611d61c3-8dd1-46e4-a579-ded4e91917ed"). InnerVolumeSpecName "ceilometer-compute-config-data-0". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.998670 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-2\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-2\") on node \"crc\" DevicePath \"\"" Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.998705 4765 reconciler_common.go:293] "Volume detached for volume \"telemetry-combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-telemetry-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.998717 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-1\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-1\") on node \"crc\" DevicePath \"\"" Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.998729 4765 reconciler_common.go:293] "Volume detached for volume \"inventory\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-inventory\") on node \"crc\" DevicePath \"\"" Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.998740 4765 reconciler_common.go:293] "Volume detached for volume \"ceilometer-compute-config-data-0\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ceilometer-compute-config-data-0\") on node \"crc\" DevicePath \"\"" Mar 
19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.998751 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key-openstack-edpm-ipam\" (UniqueName: \"kubernetes.io/secret/611d61c3-8dd1-46e4-a579-ded4e91917ed-ssh-key-openstack-edpm-ipam\") on node \"crc\" DevicePath \"\"" Mar 19 11:06:54 crc kubenswrapper[4765]: I0319 11:06:54.998762 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p6tlx\" (UniqueName: \"kubernetes.io/projected/611d61c3-8dd1-46e4-a579-ded4e91917ed-kube-api-access-p6tlx\") on node \"crc\" DevicePath \"\"" Mar 19 11:06:55 crc kubenswrapper[4765]: I0319 11:06:55.320197 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" event={"ID":"611d61c3-8dd1-46e4-a579-ded4e91917ed","Type":"ContainerDied","Data":"bdf2cba5e4701c58dbf68c283085c8f1af79cc37ded697142d26b9d7486d6590"} Mar 19 11:06:55 crc kubenswrapper[4765]: I0319 11:06:55.320243 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdf2cba5e4701c58dbf68c283085c8f1af79cc37ded697142d26b9d7486d6590" Mar 19 11:06:55 crc kubenswrapper[4765]: I0319 11:06:55.320261 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq" Mar 19 11:06:57 crc kubenswrapper[4765]: I0319 11:06:57.978719 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:06:57 crc kubenswrapper[4765]: I0319 11:06:57.979055 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:06:58 crc kubenswrapper[4765]: I0319 11:06:58.036536 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:06:58 crc kubenswrapper[4765]: I0319 11:06:58.395548 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:06:58 crc kubenswrapper[4765]: I0319 11:06:58.446293 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d2lck"] Mar 19 11:07:00 crc kubenswrapper[4765]: I0319 11:07:00.371631 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-d2lck" podUID="b0a58266-7d6b-4b46-8723-821c8b4fde2c" containerName="registry-server" containerID="cri-o://5093b410b3a389a154796a8d9be15c7790ead2dc7d2ba755b8fdea34dd994cda" gracePeriod=2 Mar 19 11:07:00 crc kubenswrapper[4765]: I0319 11:07:00.846695 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.023626 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7kt2b\" (UniqueName: \"kubernetes.io/projected/b0a58266-7d6b-4b46-8723-821c8b4fde2c-kube-api-access-7kt2b\") pod \"b0a58266-7d6b-4b46-8723-821c8b4fde2c\" (UID: \"b0a58266-7d6b-4b46-8723-821c8b4fde2c\") " Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.023708 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a58266-7d6b-4b46-8723-821c8b4fde2c-utilities\") pod \"b0a58266-7d6b-4b46-8723-821c8b4fde2c\" (UID: \"b0a58266-7d6b-4b46-8723-821c8b4fde2c\") " Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.023878 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a58266-7d6b-4b46-8723-821c8b4fde2c-catalog-content\") pod \"b0a58266-7d6b-4b46-8723-821c8b4fde2c\" (UID: \"b0a58266-7d6b-4b46-8723-821c8b4fde2c\") " Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.025547 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a58266-7d6b-4b46-8723-821c8b4fde2c-utilities" (OuterVolumeSpecName: "utilities") pod "b0a58266-7d6b-4b46-8723-821c8b4fde2c" (UID: "b0a58266-7d6b-4b46-8723-821c8b4fde2c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.034302 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b0a58266-7d6b-4b46-8723-821c8b4fde2c-kube-api-access-7kt2b" (OuterVolumeSpecName: "kube-api-access-7kt2b") pod "b0a58266-7d6b-4b46-8723-821c8b4fde2c" (UID: "b0a58266-7d6b-4b46-8723-821c8b4fde2c"). InnerVolumeSpecName "kube-api-access-7kt2b". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.092359 4765 scope.go:117] "RemoveContainer" containerID="eff6eca1fe5d0d1e0b4882c99d4314a795561f5a0400bb936acd85ec1e3dc4cf" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.097985 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b0a58266-7d6b-4b46-8723-821c8b4fde2c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b0a58266-7d6b-4b46-8723-821c8b4fde2c" (UID: "b0a58266-7d6b-4b46-8723-821c8b4fde2c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.126945 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b0a58266-7d6b-4b46-8723-821c8b4fde2c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.127007 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7kt2b\" (UniqueName: \"kubernetes.io/projected/b0a58266-7d6b-4b46-8723-821c8b4fde2c-kube-api-access-7kt2b\") on node \"crc\" DevicePath \"\"" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.127025 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b0a58266-7d6b-4b46-8723-821c8b4fde2c-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.385131 4765 generic.go:334] "Generic (PLEG): container finished" podID="b0a58266-7d6b-4b46-8723-821c8b4fde2c" containerID="5093b410b3a389a154796a8d9be15c7790ead2dc7d2ba755b8fdea34dd994cda" exitCode=0 Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.385199 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2lck" 
event={"ID":"b0a58266-7d6b-4b46-8723-821c8b4fde2c","Type":"ContainerDied","Data":"5093b410b3a389a154796a8d9be15c7790ead2dc7d2ba755b8fdea34dd994cda"} Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.385236 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-d2lck" event={"ID":"b0a58266-7d6b-4b46-8723-821c8b4fde2c","Type":"ContainerDied","Data":"943dc590e0a21192e20fb5a24afc236e6252e756a264e11bff46d20bd3c7cd8e"} Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.385249 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-d2lck" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.385266 4765 scope.go:117] "RemoveContainer" containerID="5093b410b3a389a154796a8d9be15c7790ead2dc7d2ba755b8fdea34dd994cda" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.427254 4765 scope.go:117] "RemoveContainer" containerID="c5de56e285bcb0e04d7bab04172ca89c965245c388d252f3261e529fc068cae8" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.439368 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-d2lck"] Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.452785 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-d2lck"] Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.464673 4765 scope.go:117] "RemoveContainer" containerID="1ff227a6e9ddc71decae4cb23b5d89b1970adf855d9577923d0eabb996f69094" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.487996 4765 scope.go:117] "RemoveContainer" containerID="5093b410b3a389a154796a8d9be15c7790ead2dc7d2ba755b8fdea34dd994cda" Mar 19 11:07:01 crc kubenswrapper[4765]: E0319 11:07:01.488699 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5093b410b3a389a154796a8d9be15c7790ead2dc7d2ba755b8fdea34dd994cda\": container 
with ID starting with 5093b410b3a389a154796a8d9be15c7790ead2dc7d2ba755b8fdea34dd994cda not found: ID does not exist" containerID="5093b410b3a389a154796a8d9be15c7790ead2dc7d2ba755b8fdea34dd994cda" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.488795 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5093b410b3a389a154796a8d9be15c7790ead2dc7d2ba755b8fdea34dd994cda"} err="failed to get container status \"5093b410b3a389a154796a8d9be15c7790ead2dc7d2ba755b8fdea34dd994cda\": rpc error: code = NotFound desc = could not find container \"5093b410b3a389a154796a8d9be15c7790ead2dc7d2ba755b8fdea34dd994cda\": container with ID starting with 5093b410b3a389a154796a8d9be15c7790ead2dc7d2ba755b8fdea34dd994cda not found: ID does not exist" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.488864 4765 scope.go:117] "RemoveContainer" containerID="c5de56e285bcb0e04d7bab04172ca89c965245c388d252f3261e529fc068cae8" Mar 19 11:07:01 crc kubenswrapper[4765]: E0319 11:07:01.490421 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c5de56e285bcb0e04d7bab04172ca89c965245c388d252f3261e529fc068cae8\": container with ID starting with c5de56e285bcb0e04d7bab04172ca89c965245c388d252f3261e529fc068cae8 not found: ID does not exist" containerID="c5de56e285bcb0e04d7bab04172ca89c965245c388d252f3261e529fc068cae8" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.490497 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c5de56e285bcb0e04d7bab04172ca89c965245c388d252f3261e529fc068cae8"} err="failed to get container status \"c5de56e285bcb0e04d7bab04172ca89c965245c388d252f3261e529fc068cae8\": rpc error: code = NotFound desc = could not find container \"c5de56e285bcb0e04d7bab04172ca89c965245c388d252f3261e529fc068cae8\": container with ID starting with c5de56e285bcb0e04d7bab04172ca89c965245c388d252f3261e529fc068cae8 not 
found: ID does not exist" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.490540 4765 scope.go:117] "RemoveContainer" containerID="1ff227a6e9ddc71decae4cb23b5d89b1970adf855d9577923d0eabb996f69094" Mar 19 11:07:01 crc kubenswrapper[4765]: E0319 11:07:01.491351 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ff227a6e9ddc71decae4cb23b5d89b1970adf855d9577923d0eabb996f69094\": container with ID starting with 1ff227a6e9ddc71decae4cb23b5d89b1970adf855d9577923d0eabb996f69094 not found: ID does not exist" containerID="1ff227a6e9ddc71decae4cb23b5d89b1970adf855d9577923d0eabb996f69094" Mar 19 11:07:01 crc kubenswrapper[4765]: I0319 11:07:01.491450 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ff227a6e9ddc71decae4cb23b5d89b1970adf855d9577923d0eabb996f69094"} err="failed to get container status \"1ff227a6e9ddc71decae4cb23b5d89b1970adf855d9577923d0eabb996f69094\": rpc error: code = NotFound desc = could not find container \"1ff227a6e9ddc71decae4cb23b5d89b1970adf855d9577923d0eabb996f69094\": container with ID starting with 1ff227a6e9ddc71decae4cb23b5d89b1970adf855d9577923d0eabb996f69094 not found: ID does not exist" Mar 19 11:07:02 crc kubenswrapper[4765]: I0319 11:07:02.371647 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b0a58266-7d6b-4b46-8723-821c8b4fde2c" path="/var/lib/kubelet/pods/b0a58266-7d6b-4b46-8723-821c8b4fde2c/volumes" Mar 19 11:07:05 crc kubenswrapper[4765]: I0319 11:07:05.356212 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:07:06 crc kubenswrapper[4765]: I0319 11:07:06.449255 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" 
event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"13204da80ca05cc175b015bf0b4bf2b868c7ec91fcec2cd7e027044b6cd5d8e2"} Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.679499 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-smqsz"] Mar 19 11:07:31 crc kubenswrapper[4765]: E0319 11:07:31.680490 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="611d61c3-8dd1-46e4-a579-ded4e91917ed" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.680506 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="611d61c3-8dd1-46e4-a579-ded4e91917ed" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 11:07:31 crc kubenswrapper[4765]: E0319 11:07:31.680531 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a58266-7d6b-4b46-8723-821c8b4fde2c" containerName="extract-content" Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.680537 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a58266-7d6b-4b46-8723-821c8b4fde2c" containerName="extract-content" Mar 19 11:07:31 crc kubenswrapper[4765]: E0319 11:07:31.680553 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a58266-7d6b-4b46-8723-821c8b4fde2c" containerName="extract-utilities" Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.680559 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a58266-7d6b-4b46-8723-821c8b4fde2c" containerName="extract-utilities" Mar 19 11:07:31 crc kubenswrapper[4765]: E0319 11:07:31.680571 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b0a58266-7d6b-4b46-8723-821c8b4fde2c" containerName="registry-server" Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.680577 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="b0a58266-7d6b-4b46-8723-821c8b4fde2c" containerName="registry-server" Mar 19 11:07:31 crc 
kubenswrapper[4765]: I0319 11:07:31.680779 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="611d61c3-8dd1-46e4-a579-ded4e91917ed" containerName="telemetry-edpm-deployment-openstack-edpm-ipam" Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.680793 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="b0a58266-7d6b-4b46-8723-821c8b4fde2c" containerName="registry-server" Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.682170 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.688130 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smqsz"] Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.849026 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2sj9l\" (UniqueName: \"kubernetes.io/projected/a4db8f61-ac44-4817-8596-333b4dd2f4e2-kube-api-access-2sj9l\") pod \"redhat-operators-smqsz\" (UID: \"a4db8f61-ac44-4817-8596-333b4dd2f4e2\") " pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.849080 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4db8f61-ac44-4817-8596-333b4dd2f4e2-utilities\") pod \"redhat-operators-smqsz\" (UID: \"a4db8f61-ac44-4817-8596-333b4dd2f4e2\") " pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.849541 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4db8f61-ac44-4817-8596-333b4dd2f4e2-catalog-content\") pod \"redhat-operators-smqsz\" (UID: \"a4db8f61-ac44-4817-8596-333b4dd2f4e2\") " 
pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.951255 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4db8f61-ac44-4817-8596-333b4dd2f4e2-catalog-content\") pod \"redhat-operators-smqsz\" (UID: \"a4db8f61-ac44-4817-8596-333b4dd2f4e2\") " pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.951368 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2sj9l\" (UniqueName: \"kubernetes.io/projected/a4db8f61-ac44-4817-8596-333b4dd2f4e2-kube-api-access-2sj9l\") pod \"redhat-operators-smqsz\" (UID: \"a4db8f61-ac44-4817-8596-333b4dd2f4e2\") " pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.951396 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4db8f61-ac44-4817-8596-333b4dd2f4e2-utilities\") pod \"redhat-operators-smqsz\" (UID: \"a4db8f61-ac44-4817-8596-333b4dd2f4e2\") " pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.951885 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4db8f61-ac44-4817-8596-333b4dd2f4e2-catalog-content\") pod \"redhat-operators-smqsz\" (UID: \"a4db8f61-ac44-4817-8596-333b4dd2f4e2\") " pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:31 crc kubenswrapper[4765]: I0319 11:07:31.952019 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4db8f61-ac44-4817-8596-333b4dd2f4e2-utilities\") pod \"redhat-operators-smqsz\" (UID: \"a4db8f61-ac44-4817-8596-333b4dd2f4e2\") " pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:31 crc 
kubenswrapper[4765]: I0319 11:07:31.986068 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2sj9l\" (UniqueName: \"kubernetes.io/projected/a4db8f61-ac44-4817-8596-333b4dd2f4e2-kube-api-access-2sj9l\") pod \"redhat-operators-smqsz\" (UID: \"a4db8f61-ac44-4817-8596-333b4dd2f4e2\") " pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:32 crc kubenswrapper[4765]: I0319 11:07:32.015748 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:32 crc kubenswrapper[4765]: I0319 11:07:32.549677 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-smqsz"] Mar 19 11:07:32 crc kubenswrapper[4765]: I0319 11:07:32.728495 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smqsz" event={"ID":"a4db8f61-ac44-4817-8596-333b4dd2f4e2","Type":"ContainerStarted","Data":"99f2338971ed4ec4a14a116cbad3eb8e2403710382dbad909d9f1054ece1f5f0"} Mar 19 11:07:33 crc kubenswrapper[4765]: I0319 11:07:33.739899 4765 generic.go:334] "Generic (PLEG): container finished" podID="a4db8f61-ac44-4817-8596-333b4dd2f4e2" containerID="395d7d0b50dcdc02d71d5bb37fd81cfb769fd0c151c95bd742b2f6c68c417108" exitCode=0 Mar 19 11:07:33 crc kubenswrapper[4765]: I0319 11:07:33.740011 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smqsz" event={"ID":"a4db8f61-ac44-4817-8596-333b4dd2f4e2","Type":"ContainerDied","Data":"395d7d0b50dcdc02d71d5bb37fd81cfb769fd0c151c95bd742b2f6c68c417108"} Mar 19 11:07:35 crc kubenswrapper[4765]: I0319 11:07:35.766829 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smqsz" event={"ID":"a4db8f61-ac44-4817-8596-333b4dd2f4e2","Type":"ContainerStarted","Data":"908783d8a7c8998fb855f56c86e0c51ef45472cf0a611930853526ce2746e0cb"} Mar 19 11:07:37 crc kubenswrapper[4765]: I0319 
11:07:37.789302 4765 generic.go:334] "Generic (PLEG): container finished" podID="a4db8f61-ac44-4817-8596-333b4dd2f4e2" containerID="908783d8a7c8998fb855f56c86e0c51ef45472cf0a611930853526ce2746e0cb" exitCode=0 Mar 19 11:07:37 crc kubenswrapper[4765]: I0319 11:07:37.789394 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smqsz" event={"ID":"a4db8f61-ac44-4817-8596-333b4dd2f4e2","Type":"ContainerDied","Data":"908783d8a7c8998fb855f56c86e0c51ef45472cf0a611930853526ce2746e0cb"} Mar 19 11:07:38 crc kubenswrapper[4765]: I0319 11:07:38.802609 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smqsz" event={"ID":"a4db8f61-ac44-4817-8596-333b4dd2f4e2","Type":"ContainerStarted","Data":"d780d4bf3820a560d19ab468d1feb9a9b2f99de3ea1d33e768d874da9910702a"} Mar 19 11:07:38 crc kubenswrapper[4765]: I0319 11:07:38.831304 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-smqsz" podStartSLOduration=3.046432256 podStartE2EDuration="7.831272638s" podCreationTimestamp="2026-03-19 11:07:31 +0000 UTC" firstStartedPulling="2026-03-19 11:07:33.742552631 +0000 UTC m=+2752.091498183" lastFinishedPulling="2026-03-19 11:07:38.527393023 +0000 UTC m=+2756.876338565" observedRunningTime="2026-03-19 11:07:38.819253473 +0000 UTC m=+2757.168199045" watchObservedRunningTime="2026-03-19 11:07:38.831272638 +0000 UTC m=+2757.180218210" Mar 19 11:07:42 crc kubenswrapper[4765]: I0319 11:07:42.016072 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:42 crc kubenswrapper[4765]: I0319 11:07:42.017983 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:43 crc kubenswrapper[4765]: I0319 11:07:43.072104 4765 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-smqsz" podUID="a4db8f61-ac44-4817-8596-333b4dd2f4e2" containerName="registry-server" probeResult="failure" output=< Mar 19 11:07:43 crc kubenswrapper[4765]: timeout: failed to connect service ":50051" within 1s Mar 19 11:07:43 crc kubenswrapper[4765]: > Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.790020 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.793109 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.809173 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.829981 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-whrlw" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.830209 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-custom-data-s0" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.830391 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"tempest-tests-tempest-env-vars-s0" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.830529 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"test-operator-controller-priv-key" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.886320 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.886447 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m2wf\" (UniqueName: \"kubernetes.io/projected/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-kube-api-access-4m2wf\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.886496 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.886668 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.886799 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.886850 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.886984 4765 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-config-data\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.887064 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.887249 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.988823 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4m2wf\" (UniqueName: \"kubernetes.io/projected/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-kube-api-access-4m2wf\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.988906 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.988975 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.989025 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.989055 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.989102 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-config-data\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.989141 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.989193 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.989238 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.989787 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-test-operator-ephemeral-workdir\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.990509 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-test-operator-ephemeral-temporary\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.990945 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.992427 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: 
\"kubernetes.io/configmap/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-openstack-config\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.993204 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-config-data\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.997921 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-openstack-config-secret\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.998653 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-ca-certs\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:47 crc kubenswrapper[4765]: I0319 11:07:47.999201 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-ssh-key\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:48 crc kubenswrapper[4765]: I0319 11:07:48.010129 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m2wf\" (UniqueName: \"kubernetes.io/projected/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-kube-api-access-4m2wf\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " 
pod="openstack/tempest-tests-tempest" Mar 19 11:07:48 crc kubenswrapper[4765]: I0319 11:07:48.045527 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"tempest-tests-tempest\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " pod="openstack/tempest-tests-tempest" Mar 19 11:07:48 crc kubenswrapper[4765]: I0319 11:07:48.163927 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 11:07:48 crc kubenswrapper[4765]: I0319 11:07:48.613836 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/tempest-tests-tempest"] Mar 19 11:07:48 crc kubenswrapper[4765]: W0319 11:07:48.621122 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65eabf0c_0a01_4d5b_aefd_d9ce064e1d66.slice/crio-90a9f228b3cf02fbbd0ef331c015a214abf2cf7294e39b3d1b092f07f17a9706 WatchSource:0}: Error finding container 90a9f228b3cf02fbbd0ef331c015a214abf2cf7294e39b3d1b092f07f17a9706: Status 404 returned error can't find the container with id 90a9f228b3cf02fbbd0ef331c015a214abf2cf7294e39b3d1b092f07f17a9706 Mar 19 11:07:48 crc kubenswrapper[4765]: I0319 11:07:48.910346 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66","Type":"ContainerStarted","Data":"90a9f228b3cf02fbbd0ef331c015a214abf2cf7294e39b3d1b092f07f17a9706"} Mar 19 11:07:52 crc kubenswrapper[4765]: I0319 11:07:52.069367 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:52 crc kubenswrapper[4765]: I0319 11:07:52.142508 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:52 crc kubenswrapper[4765]: I0319 
11:07:52.304698 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smqsz"] Mar 19 11:07:53 crc kubenswrapper[4765]: I0319 11:07:53.977137 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-smqsz" podUID="a4db8f61-ac44-4817-8596-333b4dd2f4e2" containerName="registry-server" containerID="cri-o://d780d4bf3820a560d19ab468d1feb9a9b2f99de3ea1d33e768d874da9910702a" gracePeriod=2 Mar 19 11:07:54 crc kubenswrapper[4765]: I0319 11:07:54.526052 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:54 crc kubenswrapper[4765]: I0319 11:07:54.641582 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2sj9l\" (UniqueName: \"kubernetes.io/projected/a4db8f61-ac44-4817-8596-333b4dd2f4e2-kube-api-access-2sj9l\") pod \"a4db8f61-ac44-4817-8596-333b4dd2f4e2\" (UID: \"a4db8f61-ac44-4817-8596-333b4dd2f4e2\") " Mar 19 11:07:54 crc kubenswrapper[4765]: I0319 11:07:54.641758 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4db8f61-ac44-4817-8596-333b4dd2f4e2-catalog-content\") pod \"a4db8f61-ac44-4817-8596-333b4dd2f4e2\" (UID: \"a4db8f61-ac44-4817-8596-333b4dd2f4e2\") " Mar 19 11:07:54 crc kubenswrapper[4765]: I0319 11:07:54.641883 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4db8f61-ac44-4817-8596-333b4dd2f4e2-utilities\") pod \"a4db8f61-ac44-4817-8596-333b4dd2f4e2\" (UID: \"a4db8f61-ac44-4817-8596-333b4dd2f4e2\") " Mar 19 11:07:54 crc kubenswrapper[4765]: I0319 11:07:54.642888 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4db8f61-ac44-4817-8596-333b4dd2f4e2-utilities" (OuterVolumeSpecName: 
"utilities") pod "a4db8f61-ac44-4817-8596-333b4dd2f4e2" (UID: "a4db8f61-ac44-4817-8596-333b4dd2f4e2"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:07:54 crc kubenswrapper[4765]: I0319 11:07:54.649787 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a4db8f61-ac44-4817-8596-333b4dd2f4e2-kube-api-access-2sj9l" (OuterVolumeSpecName: "kube-api-access-2sj9l") pod "a4db8f61-ac44-4817-8596-333b4dd2f4e2" (UID: "a4db8f61-ac44-4817-8596-333b4dd2f4e2"). InnerVolumeSpecName "kube-api-access-2sj9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:07:54 crc kubenswrapper[4765]: I0319 11:07:54.744678 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a4db8f61-ac44-4817-8596-333b4dd2f4e2-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 11:07:54 crc kubenswrapper[4765]: I0319 11:07:54.744719 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2sj9l\" (UniqueName: \"kubernetes.io/projected/a4db8f61-ac44-4817-8596-333b4dd2f4e2-kube-api-access-2sj9l\") on node \"crc\" DevicePath \"\"" Mar 19 11:07:54 crc kubenswrapper[4765]: I0319 11:07:54.859869 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a4db8f61-ac44-4817-8596-333b4dd2f4e2-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a4db8f61-ac44-4817-8596-333b4dd2f4e2" (UID: "a4db8f61-ac44-4817-8596-333b4dd2f4e2"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:07:54 crc kubenswrapper[4765]: I0319 11:07:54.950045 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a4db8f61-ac44-4817-8596-333b4dd2f4e2-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 11:07:54 crc kubenswrapper[4765]: I0319 11:07:54.990577 4765 generic.go:334] "Generic (PLEG): container finished" podID="a4db8f61-ac44-4817-8596-333b4dd2f4e2" containerID="d780d4bf3820a560d19ab468d1feb9a9b2f99de3ea1d33e768d874da9910702a" exitCode=0 Mar 19 11:07:54 crc kubenswrapper[4765]: I0319 11:07:54.990639 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smqsz" event={"ID":"a4db8f61-ac44-4817-8596-333b4dd2f4e2","Type":"ContainerDied","Data":"d780d4bf3820a560d19ab468d1feb9a9b2f99de3ea1d33e768d874da9910702a"} Mar 19 11:07:54 crc kubenswrapper[4765]: I0319 11:07:54.990668 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-smqsz" Mar 19 11:07:54 crc kubenswrapper[4765]: I0319 11:07:54.990695 4765 scope.go:117] "RemoveContainer" containerID="d780d4bf3820a560d19ab468d1feb9a9b2f99de3ea1d33e768d874da9910702a" Mar 19 11:07:54 crc kubenswrapper[4765]: I0319 11:07:54.990681 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-smqsz" event={"ID":"a4db8f61-ac44-4817-8596-333b4dd2f4e2","Type":"ContainerDied","Data":"99f2338971ed4ec4a14a116cbad3eb8e2403710382dbad909d9f1054ece1f5f0"} Mar 19 11:07:55 crc kubenswrapper[4765]: I0319 11:07:55.023307 4765 scope.go:117] "RemoveContainer" containerID="908783d8a7c8998fb855f56c86e0c51ef45472cf0a611930853526ce2746e0cb" Mar 19 11:07:55 crc kubenswrapper[4765]: I0319 11:07:55.031323 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-smqsz"] Mar 19 11:07:55 crc kubenswrapper[4765]: I0319 11:07:55.041272 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-smqsz"] Mar 19 11:07:55 crc kubenswrapper[4765]: I0319 11:07:55.048670 4765 scope.go:117] "RemoveContainer" containerID="395d7d0b50dcdc02d71d5bb37fd81cfb769fd0c151c95bd742b2f6c68c417108" Mar 19 11:07:55 crc kubenswrapper[4765]: I0319 11:07:55.103836 4765 scope.go:117] "RemoveContainer" containerID="d780d4bf3820a560d19ab468d1feb9a9b2f99de3ea1d33e768d874da9910702a" Mar 19 11:07:55 crc kubenswrapper[4765]: E0319 11:07:55.104525 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d780d4bf3820a560d19ab468d1feb9a9b2f99de3ea1d33e768d874da9910702a\": container with ID starting with d780d4bf3820a560d19ab468d1feb9a9b2f99de3ea1d33e768d874da9910702a not found: ID does not exist" containerID="d780d4bf3820a560d19ab468d1feb9a9b2f99de3ea1d33e768d874da9910702a" Mar 19 11:07:55 crc kubenswrapper[4765]: I0319 11:07:55.104592 4765 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d780d4bf3820a560d19ab468d1feb9a9b2f99de3ea1d33e768d874da9910702a"} err="failed to get container status \"d780d4bf3820a560d19ab468d1feb9a9b2f99de3ea1d33e768d874da9910702a\": rpc error: code = NotFound desc = could not find container \"d780d4bf3820a560d19ab468d1feb9a9b2f99de3ea1d33e768d874da9910702a\": container with ID starting with d780d4bf3820a560d19ab468d1feb9a9b2f99de3ea1d33e768d874da9910702a not found: ID does not exist" Mar 19 11:07:55 crc kubenswrapper[4765]: I0319 11:07:55.104634 4765 scope.go:117] "RemoveContainer" containerID="908783d8a7c8998fb855f56c86e0c51ef45472cf0a611930853526ce2746e0cb" Mar 19 11:07:55 crc kubenswrapper[4765]: E0319 11:07:55.105111 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"908783d8a7c8998fb855f56c86e0c51ef45472cf0a611930853526ce2746e0cb\": container with ID starting with 908783d8a7c8998fb855f56c86e0c51ef45472cf0a611930853526ce2746e0cb not found: ID does not exist" containerID="908783d8a7c8998fb855f56c86e0c51ef45472cf0a611930853526ce2746e0cb" Mar 19 11:07:55 crc kubenswrapper[4765]: I0319 11:07:55.105166 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"908783d8a7c8998fb855f56c86e0c51ef45472cf0a611930853526ce2746e0cb"} err="failed to get container status \"908783d8a7c8998fb855f56c86e0c51ef45472cf0a611930853526ce2746e0cb\": rpc error: code = NotFound desc = could not find container \"908783d8a7c8998fb855f56c86e0c51ef45472cf0a611930853526ce2746e0cb\": container with ID starting with 908783d8a7c8998fb855f56c86e0c51ef45472cf0a611930853526ce2746e0cb not found: ID does not exist" Mar 19 11:07:55 crc kubenswrapper[4765]: I0319 11:07:55.105202 4765 scope.go:117] "RemoveContainer" containerID="395d7d0b50dcdc02d71d5bb37fd81cfb769fd0c151c95bd742b2f6c68c417108" Mar 19 11:07:55 crc kubenswrapper[4765]: E0319 
11:07:55.105668 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"395d7d0b50dcdc02d71d5bb37fd81cfb769fd0c151c95bd742b2f6c68c417108\": container with ID starting with 395d7d0b50dcdc02d71d5bb37fd81cfb769fd0c151c95bd742b2f6c68c417108 not found: ID does not exist" containerID="395d7d0b50dcdc02d71d5bb37fd81cfb769fd0c151c95bd742b2f6c68c417108" Mar 19 11:07:55 crc kubenswrapper[4765]: I0319 11:07:55.105703 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"395d7d0b50dcdc02d71d5bb37fd81cfb769fd0c151c95bd742b2f6c68c417108"} err="failed to get container status \"395d7d0b50dcdc02d71d5bb37fd81cfb769fd0c151c95bd742b2f6c68c417108\": rpc error: code = NotFound desc = could not find container \"395d7d0b50dcdc02d71d5bb37fd81cfb769fd0c151c95bd742b2f6c68c417108\": container with ID starting with 395d7d0b50dcdc02d71d5bb37fd81cfb769fd0c151c95bd742b2f6c68c417108 not found: ID does not exist" Mar 19 11:07:56 crc kubenswrapper[4765]: I0319 11:07:56.366700 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a4db8f61-ac44-4817-8596-333b4dd2f4e2" path="/var/lib/kubelet/pods/a4db8f61-ac44-4817-8596-333b4dd2f4e2/volumes" Mar 19 11:08:00 crc kubenswrapper[4765]: I0319 11:08:00.141946 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565308-dzlmc"] Mar 19 11:08:00 crc kubenswrapper[4765]: E0319 11:08:00.143063 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4db8f61-ac44-4817-8596-333b4dd2f4e2" containerName="extract-utilities" Mar 19 11:08:00 crc kubenswrapper[4765]: I0319 11:08:00.143080 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4db8f61-ac44-4817-8596-333b4dd2f4e2" containerName="extract-utilities" Mar 19 11:08:00 crc kubenswrapper[4765]: E0319 11:08:00.143099 4765 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="a4db8f61-ac44-4817-8596-333b4dd2f4e2" containerName="extract-content" Mar 19 11:08:00 crc kubenswrapper[4765]: I0319 11:08:00.143106 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4db8f61-ac44-4817-8596-333b4dd2f4e2" containerName="extract-content" Mar 19 11:08:00 crc kubenswrapper[4765]: E0319 11:08:00.143132 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a4db8f61-ac44-4817-8596-333b4dd2f4e2" containerName="registry-server" Mar 19 11:08:00 crc kubenswrapper[4765]: I0319 11:08:00.143140 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a4db8f61-ac44-4817-8596-333b4dd2f4e2" containerName="registry-server" Mar 19 11:08:00 crc kubenswrapper[4765]: I0319 11:08:00.143348 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="a4db8f61-ac44-4817-8596-333b4dd2f4e2" containerName="registry-server" Mar 19 11:08:00 crc kubenswrapper[4765]: I0319 11:08:00.144138 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565308-dzlmc" Mar 19 11:08:00 crc kubenswrapper[4765]: I0319 11:08:00.146859 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:08:00 crc kubenswrapper[4765]: I0319 11:08:00.147472 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:08:00 crc kubenswrapper[4765]: I0319 11:08:00.147706 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:08:00 crc kubenswrapper[4765]: I0319 11:08:00.154702 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565308-dzlmc"] Mar 19 11:08:00 crc kubenswrapper[4765]: I0319 11:08:00.274857 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vm6rs\" (UniqueName: 
\"kubernetes.io/projected/f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9-kube-api-access-vm6rs\") pod \"auto-csr-approver-29565308-dzlmc\" (UID: \"f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9\") " pod="openshift-infra/auto-csr-approver-29565308-dzlmc" Mar 19 11:08:00 crc kubenswrapper[4765]: I0319 11:08:00.376401 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vm6rs\" (UniqueName: \"kubernetes.io/projected/f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9-kube-api-access-vm6rs\") pod \"auto-csr-approver-29565308-dzlmc\" (UID: \"f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9\") " pod="openshift-infra/auto-csr-approver-29565308-dzlmc" Mar 19 11:08:00 crc kubenswrapper[4765]: I0319 11:08:00.399419 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vm6rs\" (UniqueName: \"kubernetes.io/projected/f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9-kube-api-access-vm6rs\") pod \"auto-csr-approver-29565308-dzlmc\" (UID: \"f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9\") " pod="openshift-infra/auto-csr-approver-29565308-dzlmc" Mar 19 11:08:00 crc kubenswrapper[4765]: I0319 11:08:00.472322 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565308-dzlmc" Mar 19 11:08:15 crc kubenswrapper[4765]: E0319 11:08:15.198538 4765 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified" Mar 19 11:08:15 crc kubenswrapper[4765]: E0319 11:08:15.199285 4765 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:tempest-tests-tempest-tests-runner,Image:quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config-data,ReadOnly:false,MountPath:/etc/test_operator,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-workdir,ReadOnly:false,MountPath:/var/lib/tempest,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-ephemeral-temporary,ReadOnly:false,MountPath:/tmp,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:test-operator-logs,ReadOnly:false,MountPath:/var/lib/tempest/external_files,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/etc/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config,ReadOnly:true,MountPath:/var/lib/tempest/.config/openstack/clouds.yaml,SubPath:clouds.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:openstack-config-secret,ReadOnly:false,MountPath:/etc/openstack/secure.yaml,SubPath:secure.yaml,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ca-certs,ReadOnly:true,MountPath:/etc/pki/ca-tru
st/extracted/pem/tls-ca-bundle.pem,SubPath:tls-ca-bundle.pem,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:ssh-key,ReadOnly:false,MountPath:/var/lib/tempest/id_ecdsa,SubPath:ssh_key,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-4m2wf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*42480,RunAsNonRoot:*false,ReadOnlyRootFilesystem:*false,AllowPrivilegeEscalation:*true,RunAsGroup:*42480,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-custom-data-s0,},Optional:nil,},SecretRef:nil,},EnvFromSource{Prefix:,ConfigMapRef:&ConfigMapEnvSource{LocalObjectReference:LocalObjectReference{Name:tempest-tests-tempest-env-vars-s0,},Optional:nil,},SecretRef:nil,},},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod tempest-tests-tempest_openstack(65eabf0c-0a01-4d5b-aefd-d9ce064e1d66): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 19 11:08:15 crc kubenswrapper[4765]: E0319 11:08:15.201460 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/tempest-tests-tempest" 
podUID="65eabf0c-0a01-4d5b-aefd-d9ce064e1d66" Mar 19 11:08:15 crc kubenswrapper[4765]: E0319 11:08:15.220399 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tempest-tests-tempest-tests-runner\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-tempest-all:current-podified\\\"\"" pod="openstack/tempest-tests-tempest" podUID="65eabf0c-0a01-4d5b-aefd-d9ce064e1d66" Mar 19 11:08:15 crc kubenswrapper[4765]: I0319 11:08:15.624134 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565308-dzlmc"] Mar 19 11:08:15 crc kubenswrapper[4765]: I0319 11:08:15.626581 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 11:08:16 crc kubenswrapper[4765]: I0319 11:08:16.224147 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565308-dzlmc" event={"ID":"f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9","Type":"ContainerStarted","Data":"ef6809786db684e39bda9237c38adf1b09cd71d4d16e5a46ecc479dd5e0a2b86"} Mar 19 11:08:18 crc kubenswrapper[4765]: I0319 11:08:18.241211 4765 generic.go:334] "Generic (PLEG): container finished" podID="f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9" containerID="6eec538347335f65fe4b39f439f3ac17e9baf92ed165e146936650fbae782220" exitCode=0 Mar 19 11:08:18 crc kubenswrapper[4765]: I0319 11:08:18.241323 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565308-dzlmc" event={"ID":"f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9","Type":"ContainerDied","Data":"6eec538347335f65fe4b39f439f3ac17e9baf92ed165e146936650fbae782220"} Mar 19 11:08:19 crc kubenswrapper[4765]: I0319 11:08:19.617307 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565308-dzlmc" Mar 19 11:08:19 crc kubenswrapper[4765]: I0319 11:08:19.723454 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vm6rs\" (UniqueName: \"kubernetes.io/projected/f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9-kube-api-access-vm6rs\") pod \"f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9\" (UID: \"f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9\") " Mar 19 11:08:19 crc kubenswrapper[4765]: I0319 11:08:19.729007 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9-kube-api-access-vm6rs" (OuterVolumeSpecName: "kube-api-access-vm6rs") pod "f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9" (UID: "f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9"). InnerVolumeSpecName "kube-api-access-vm6rs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:08:19 crc kubenswrapper[4765]: I0319 11:08:19.826336 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vm6rs\" (UniqueName: \"kubernetes.io/projected/f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9-kube-api-access-vm6rs\") on node \"crc\" DevicePath \"\"" Mar 19 11:08:20 crc kubenswrapper[4765]: I0319 11:08:20.264900 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565308-dzlmc" event={"ID":"f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9","Type":"ContainerDied","Data":"ef6809786db684e39bda9237c38adf1b09cd71d4d16e5a46ecc479dd5e0a2b86"} Mar 19 11:08:20 crc kubenswrapper[4765]: I0319 11:08:20.265301 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ef6809786db684e39bda9237c38adf1b09cd71d4d16e5a46ecc479dd5e0a2b86" Mar 19 11:08:20 crc kubenswrapper[4765]: I0319 11:08:20.265024 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565308-dzlmc" Mar 19 11:08:20 crc kubenswrapper[4765]: I0319 11:08:20.692451 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565302-h4fq5"] Mar 19 11:08:20 crc kubenswrapper[4765]: I0319 11:08:20.698490 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565302-h4fq5"] Mar 19 11:08:22 crc kubenswrapper[4765]: I0319 11:08:22.370301 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15bb417d-3ab4-44df-a950-bbd7c17b289a" path="/var/lib/kubelet/pods/15bb417d-3ab4-44df-a950-bbd7c17b289a/volumes" Mar 19 11:08:29 crc kubenswrapper[4765]: I0319 11:08:29.352935 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66","Type":"ContainerStarted","Data":"1333137a4a2ef8a5895accf9377cfee0342716cde7a94be61c92b5f6d8908736"} Mar 19 11:08:29 crc kubenswrapper[4765]: I0319 11:08:29.373753 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/tempest-tests-tempest" podStartSLOduration=4.124926309 podStartE2EDuration="43.373734897s" podCreationTimestamp="2026-03-19 11:07:46 +0000 UTC" firstStartedPulling="2026-03-19 11:07:48.624037047 +0000 UTC m=+2766.972982589" lastFinishedPulling="2026-03-19 11:08:27.872845635 +0000 UTC m=+2806.221791177" observedRunningTime="2026-03-19 11:08:29.369003649 +0000 UTC m=+2807.717949201" watchObservedRunningTime="2026-03-19 11:08:29.373734897 +0000 UTC m=+2807.722680439" Mar 19 11:09:15 crc kubenswrapper[4765]: I0319 11:09:15.160557 4765 scope.go:117] "RemoveContainer" containerID="b9b178bcde609482bc72b19cd7b52e1399b2ddad472355f66e1f65a67d234566" Mar 19 11:09:31 crc kubenswrapper[4765]: I0319 11:09:31.656449 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: 
Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:09:31 crc kubenswrapper[4765]: I0319 11:09:31.657366 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:10:00 crc kubenswrapper[4765]: I0319 11:10:00.146068 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565310-wd645"] Mar 19 11:10:00 crc kubenswrapper[4765]: E0319 11:10:00.147221 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9" containerName="oc" Mar 19 11:10:00 crc kubenswrapper[4765]: I0319 11:10:00.147240 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9" containerName="oc" Mar 19 11:10:00 crc kubenswrapper[4765]: I0319 11:10:00.147486 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9" containerName="oc" Mar 19 11:10:00 crc kubenswrapper[4765]: I0319 11:10:00.148221 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565310-wd645" Mar 19 11:10:00 crc kubenswrapper[4765]: I0319 11:10:00.150926 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:10:00 crc kubenswrapper[4765]: I0319 11:10:00.151031 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:10:00 crc kubenswrapper[4765]: I0319 11:10:00.151044 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:10:00 crc kubenswrapper[4765]: I0319 11:10:00.161830 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565310-wd645"] Mar 19 11:10:00 crc kubenswrapper[4765]: I0319 11:10:00.219585 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtbtv\" (UniqueName: \"kubernetes.io/projected/80970b68-1570-4059-a9ff-2f5a09746ffa-kube-api-access-dtbtv\") pod \"auto-csr-approver-29565310-wd645\" (UID: \"80970b68-1570-4059-a9ff-2f5a09746ffa\") " pod="openshift-infra/auto-csr-approver-29565310-wd645" Mar 19 11:10:00 crc kubenswrapper[4765]: I0319 11:10:00.322585 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtbtv\" (UniqueName: \"kubernetes.io/projected/80970b68-1570-4059-a9ff-2f5a09746ffa-kube-api-access-dtbtv\") pod \"auto-csr-approver-29565310-wd645\" (UID: \"80970b68-1570-4059-a9ff-2f5a09746ffa\") " pod="openshift-infra/auto-csr-approver-29565310-wd645" Mar 19 11:10:00 crc kubenswrapper[4765]: I0319 11:10:00.341674 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtbtv\" (UniqueName: \"kubernetes.io/projected/80970b68-1570-4059-a9ff-2f5a09746ffa-kube-api-access-dtbtv\") pod \"auto-csr-approver-29565310-wd645\" (UID: \"80970b68-1570-4059-a9ff-2f5a09746ffa\") " 
pod="openshift-infra/auto-csr-approver-29565310-wd645" Mar 19 11:10:00 crc kubenswrapper[4765]: I0319 11:10:00.467361 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565310-wd645" Mar 19 11:10:00 crc kubenswrapper[4765]: I0319 11:10:00.914269 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565310-wd645"] Mar 19 11:10:00 crc kubenswrapper[4765]: W0319 11:10:00.916906 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod80970b68_1570_4059_a9ff_2f5a09746ffa.slice/crio-b7c49eb2f4a441e6f15f00753dd5ef18b511088236d9a7350f987bc956edfc9e WatchSource:0}: Error finding container b7c49eb2f4a441e6f15f00753dd5ef18b511088236d9a7350f987bc956edfc9e: Status 404 returned error can't find the container with id b7c49eb2f4a441e6f15f00753dd5ef18b511088236d9a7350f987bc956edfc9e Mar 19 11:10:01 crc kubenswrapper[4765]: I0319 11:10:01.278455 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565310-wd645" event={"ID":"80970b68-1570-4059-a9ff-2f5a09746ffa","Type":"ContainerStarted","Data":"b7c49eb2f4a441e6f15f00753dd5ef18b511088236d9a7350f987bc956edfc9e"} Mar 19 11:10:01 crc kubenswrapper[4765]: I0319 11:10:01.655865 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:10:01 crc kubenswrapper[4765]: I0319 11:10:01.656270 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" Mar 19 11:10:02 crc kubenswrapper[4765]: I0319 11:10:02.294791 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565310-wd645" event={"ID":"80970b68-1570-4059-a9ff-2f5a09746ffa","Type":"ContainerStarted","Data":"8d24b6e7edb4070fed2fedb70413ced82873f98d7856a21510d00e44e9d42db6"} Mar 19 11:10:02 crc kubenswrapper[4765]: I0319 11:10:02.316240 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565310-wd645" podStartSLOduration=1.425706026 podStartE2EDuration="2.31622246s" podCreationTimestamp="2026-03-19 11:10:00 +0000 UTC" firstStartedPulling="2026-03-19 11:10:00.919584638 +0000 UTC m=+2899.268530190" lastFinishedPulling="2026-03-19 11:10:01.810101082 +0000 UTC m=+2900.159046624" observedRunningTime="2026-03-19 11:10:02.310430204 +0000 UTC m=+2900.659375736" watchObservedRunningTime="2026-03-19 11:10:02.31622246 +0000 UTC m=+2900.665168002" Mar 19 11:10:03 crc kubenswrapper[4765]: I0319 11:10:03.305665 4765 generic.go:334] "Generic (PLEG): container finished" podID="80970b68-1570-4059-a9ff-2f5a09746ffa" containerID="8d24b6e7edb4070fed2fedb70413ced82873f98d7856a21510d00e44e9d42db6" exitCode=0 Mar 19 11:10:03 crc kubenswrapper[4765]: I0319 11:10:03.305758 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565310-wd645" event={"ID":"80970b68-1570-4059-a9ff-2f5a09746ffa","Type":"ContainerDied","Data":"8d24b6e7edb4070fed2fedb70413ced82873f98d7856a21510d00e44e9d42db6"} Mar 19 11:10:04 crc kubenswrapper[4765]: I0319 11:10:04.797996 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565310-wd645" Mar 19 11:10:04 crc kubenswrapper[4765]: I0319 11:10:04.926207 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtbtv\" (UniqueName: \"kubernetes.io/projected/80970b68-1570-4059-a9ff-2f5a09746ffa-kube-api-access-dtbtv\") pod \"80970b68-1570-4059-a9ff-2f5a09746ffa\" (UID: \"80970b68-1570-4059-a9ff-2f5a09746ffa\") " Mar 19 11:10:04 crc kubenswrapper[4765]: I0319 11:10:04.933538 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80970b68-1570-4059-a9ff-2f5a09746ffa-kube-api-access-dtbtv" (OuterVolumeSpecName: "kube-api-access-dtbtv") pod "80970b68-1570-4059-a9ff-2f5a09746ffa" (UID: "80970b68-1570-4059-a9ff-2f5a09746ffa"). InnerVolumeSpecName "kube-api-access-dtbtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:10:05 crc kubenswrapper[4765]: I0319 11:10:05.029206 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtbtv\" (UniqueName: \"kubernetes.io/projected/80970b68-1570-4059-a9ff-2f5a09746ffa-kube-api-access-dtbtv\") on node \"crc\" DevicePath \"\"" Mar 19 11:10:05 crc kubenswrapper[4765]: I0319 11:10:05.327414 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565310-wd645" event={"ID":"80970b68-1570-4059-a9ff-2f5a09746ffa","Type":"ContainerDied","Data":"b7c49eb2f4a441e6f15f00753dd5ef18b511088236d9a7350f987bc956edfc9e"} Mar 19 11:10:05 crc kubenswrapper[4765]: I0319 11:10:05.327472 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7c49eb2f4a441e6f15f00753dd5ef18b511088236d9a7350f987bc956edfc9e" Mar 19 11:10:05 crc kubenswrapper[4765]: I0319 11:10:05.327499 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565310-wd645" Mar 19 11:10:05 crc kubenswrapper[4765]: I0319 11:10:05.380618 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565304-mvcqw"] Mar 19 11:10:05 crc kubenswrapper[4765]: I0319 11:10:05.391754 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565304-mvcqw"] Mar 19 11:10:06 crc kubenswrapper[4765]: I0319 11:10:06.388204 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c8c122d7-d33a-4c98-970e-be77ef4539e9" path="/var/lib/kubelet/pods/c8c122d7-d33a-4c98-970e-be77ef4539e9/volumes" Mar 19 11:10:15 crc kubenswrapper[4765]: I0319 11:10:15.268169 4765 scope.go:117] "RemoveContainer" containerID="7f8df658308dc6880e58feb2cacde21ea7c28617357ae46adae88ebbb6da430e" Mar 19 11:10:31 crc kubenswrapper[4765]: I0319 11:10:31.655648 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:10:31 crc kubenswrapper[4765]: I0319 11:10:31.656254 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:10:31 crc kubenswrapper[4765]: I0319 11:10:31.656307 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 11:10:31 crc kubenswrapper[4765]: I0319 11:10:31.656951 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"13204da80ca05cc175b015bf0b4bf2b868c7ec91fcec2cd7e027044b6cd5d8e2"} pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 11:10:31 crc kubenswrapper[4765]: I0319 11:10:31.657044 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://13204da80ca05cc175b015bf0b4bf2b868c7ec91fcec2cd7e027044b6cd5d8e2" gracePeriod=600 Mar 19 11:10:32 crc kubenswrapper[4765]: I0319 11:10:32.603946 4765 generic.go:334] "Generic (PLEG): container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerID="13204da80ca05cc175b015bf0b4bf2b868c7ec91fcec2cd7e027044b6cd5d8e2" exitCode=0 Mar 19 11:10:32 crc kubenswrapper[4765]: I0319 11:10:32.604140 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"13204da80ca05cc175b015bf0b4bf2b868c7ec91fcec2cd7e027044b6cd5d8e2"} Mar 19 11:10:32 crc kubenswrapper[4765]: I0319 11:10:32.605247 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c"} Mar 19 11:10:32 crc kubenswrapper[4765]: I0319 11:10:32.605345 4765 scope.go:117] "RemoveContainer" containerID="b40f7a75c0b00dac6b1cf20ffc80a980d2d0550531032b4fd24d0a3f9b2ef903" Mar 19 11:12:00 crc kubenswrapper[4765]: I0319 11:12:00.159146 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565312-zb8d9"] Mar 19 11:12:00 crc kubenswrapper[4765]: E0319 
11:12:00.160242 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="80970b68-1570-4059-a9ff-2f5a09746ffa" containerName="oc" Mar 19 11:12:00 crc kubenswrapper[4765]: I0319 11:12:00.160259 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="80970b68-1570-4059-a9ff-2f5a09746ffa" containerName="oc" Mar 19 11:12:00 crc kubenswrapper[4765]: I0319 11:12:00.160528 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="80970b68-1570-4059-a9ff-2f5a09746ffa" containerName="oc" Mar 19 11:12:00 crc kubenswrapper[4765]: I0319 11:12:00.161310 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565312-zb8d9" Mar 19 11:12:00 crc kubenswrapper[4765]: I0319 11:12:00.164394 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:12:00 crc kubenswrapper[4765]: I0319 11:12:00.164567 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:12:00 crc kubenswrapper[4765]: I0319 11:12:00.164702 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:12:00 crc kubenswrapper[4765]: I0319 11:12:00.168435 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565312-zb8d9"] Mar 19 11:12:00 crc kubenswrapper[4765]: I0319 11:12:00.313081 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cj57c\" (UniqueName: \"kubernetes.io/projected/2c751965-eb03-4655-9abc-917dc9b5aeb1-kube-api-access-cj57c\") pod \"auto-csr-approver-29565312-zb8d9\" (UID: \"2c751965-eb03-4655-9abc-917dc9b5aeb1\") " pod="openshift-infra/auto-csr-approver-29565312-zb8d9" Mar 19 11:12:00 crc kubenswrapper[4765]: I0319 11:12:00.415677 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cj57c\" (UniqueName: \"kubernetes.io/projected/2c751965-eb03-4655-9abc-917dc9b5aeb1-kube-api-access-cj57c\") pod \"auto-csr-approver-29565312-zb8d9\" (UID: \"2c751965-eb03-4655-9abc-917dc9b5aeb1\") " pod="openshift-infra/auto-csr-approver-29565312-zb8d9" Mar 19 11:12:00 crc kubenswrapper[4765]: I0319 11:12:00.435569 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cj57c\" (UniqueName: \"kubernetes.io/projected/2c751965-eb03-4655-9abc-917dc9b5aeb1-kube-api-access-cj57c\") pod \"auto-csr-approver-29565312-zb8d9\" (UID: \"2c751965-eb03-4655-9abc-917dc9b5aeb1\") " pod="openshift-infra/auto-csr-approver-29565312-zb8d9" Mar 19 11:12:00 crc kubenswrapper[4765]: I0319 11:12:00.494416 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565312-zb8d9" Mar 19 11:12:00 crc kubenswrapper[4765]: I0319 11:12:00.965278 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565312-zb8d9"] Mar 19 11:12:01 crc kubenswrapper[4765]: I0319 11:12:01.440444 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565312-zb8d9" event={"ID":"2c751965-eb03-4655-9abc-917dc9b5aeb1","Type":"ContainerStarted","Data":"f6deff16bf7f9ef069692dbf46f6b3d5eb29d59ebfc9e51ba5ad5afd39db58f6"} Mar 19 11:12:02 crc kubenswrapper[4765]: I0319 11:12:02.449626 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565312-zb8d9" event={"ID":"2c751965-eb03-4655-9abc-917dc9b5aeb1","Type":"ContainerStarted","Data":"b1dab3a48fe59b251ff95c56340004eeffdec8bd31fe69fee86cdecc7c716676"} Mar 19 11:12:02 crc kubenswrapper[4765]: I0319 11:12:02.467480 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565312-zb8d9" podStartSLOduration=1.282133837 podStartE2EDuration="2.46745738s" podCreationTimestamp="2026-03-19 11:12:00 
+0000 UTC" firstStartedPulling="2026-03-19 11:12:00.972754176 +0000 UTC m=+3019.321699718" lastFinishedPulling="2026-03-19 11:12:02.158077719 +0000 UTC m=+3020.507023261" observedRunningTime="2026-03-19 11:12:02.461767857 +0000 UTC m=+3020.810713399" watchObservedRunningTime="2026-03-19 11:12:02.46745738 +0000 UTC m=+3020.816402932" Mar 19 11:12:03 crc kubenswrapper[4765]: I0319 11:12:03.463118 4765 generic.go:334] "Generic (PLEG): container finished" podID="2c751965-eb03-4655-9abc-917dc9b5aeb1" containerID="b1dab3a48fe59b251ff95c56340004eeffdec8bd31fe69fee86cdecc7c716676" exitCode=0 Mar 19 11:12:03 crc kubenswrapper[4765]: I0319 11:12:03.463172 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565312-zb8d9" event={"ID":"2c751965-eb03-4655-9abc-917dc9b5aeb1","Type":"ContainerDied","Data":"b1dab3a48fe59b251ff95c56340004eeffdec8bd31fe69fee86cdecc7c716676"} Mar 19 11:12:04 crc kubenswrapper[4765]: I0319 11:12:04.859090 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565312-zb8d9" Mar 19 11:12:05 crc kubenswrapper[4765]: I0319 11:12:05.006152 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cj57c\" (UniqueName: \"kubernetes.io/projected/2c751965-eb03-4655-9abc-917dc9b5aeb1-kube-api-access-cj57c\") pod \"2c751965-eb03-4655-9abc-917dc9b5aeb1\" (UID: \"2c751965-eb03-4655-9abc-917dc9b5aeb1\") " Mar 19 11:12:05 crc kubenswrapper[4765]: I0319 11:12:05.016086 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c751965-eb03-4655-9abc-917dc9b5aeb1-kube-api-access-cj57c" (OuterVolumeSpecName: "kube-api-access-cj57c") pod "2c751965-eb03-4655-9abc-917dc9b5aeb1" (UID: "2c751965-eb03-4655-9abc-917dc9b5aeb1"). InnerVolumeSpecName "kube-api-access-cj57c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:12:05 crc kubenswrapper[4765]: I0319 11:12:05.109925 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cj57c\" (UniqueName: \"kubernetes.io/projected/2c751965-eb03-4655-9abc-917dc9b5aeb1-kube-api-access-cj57c\") on node \"crc\" DevicePath \"\"" Mar 19 11:12:05 crc kubenswrapper[4765]: I0319 11:12:05.438658 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565306-gplrg"] Mar 19 11:12:05 crc kubenswrapper[4765]: I0319 11:12:05.447602 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565306-gplrg"] Mar 19 11:12:05 crc kubenswrapper[4765]: I0319 11:12:05.484305 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565312-zb8d9" event={"ID":"2c751965-eb03-4655-9abc-917dc9b5aeb1","Type":"ContainerDied","Data":"f6deff16bf7f9ef069692dbf46f6b3d5eb29d59ebfc9e51ba5ad5afd39db58f6"} Mar 19 11:12:05 crc kubenswrapper[4765]: I0319 11:12:05.484550 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f6deff16bf7f9ef069692dbf46f6b3d5eb29d59ebfc9e51ba5ad5afd39db58f6" Mar 19 11:12:05 crc kubenswrapper[4765]: I0319 11:12:05.484663 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565312-zb8d9" Mar 19 11:12:06 crc kubenswrapper[4765]: I0319 11:12:06.371073 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3034496b-4e9f-4e55-a348-691b747da728" path="/var/lib/kubelet/pods/3034496b-4e9f-4e55-a348-691b747da728/volumes" Mar 19 11:12:15 crc kubenswrapper[4765]: I0319 11:12:15.387824 4765 scope.go:117] "RemoveContainer" containerID="37797939112b7f4b03a51c87fc26899c1a25e98e160adea32ae3b7d6f3c3e4bd" Mar 19 11:13:01 crc kubenswrapper[4765]: I0319 11:13:01.656128 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:13:01 crc kubenswrapper[4765]: I0319 11:13:01.656776 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:13:31 crc kubenswrapper[4765]: I0319 11:13:31.655758 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:13:31 crc kubenswrapper[4765]: I0319 11:13:31.656341 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" Mar 19 11:14:00 crc kubenswrapper[4765]: I0319 11:14:00.149636 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565314-f4dpf"] Mar 19 11:14:00 crc kubenswrapper[4765]: E0319 11:14:00.150635 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c751965-eb03-4655-9abc-917dc9b5aeb1" containerName="oc" Mar 19 11:14:00 crc kubenswrapper[4765]: I0319 11:14:00.150650 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c751965-eb03-4655-9abc-917dc9b5aeb1" containerName="oc" Mar 19 11:14:00 crc kubenswrapper[4765]: I0319 11:14:00.150867 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c751965-eb03-4655-9abc-917dc9b5aeb1" containerName="oc" Mar 19 11:14:00 crc kubenswrapper[4765]: I0319 11:14:00.151635 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565314-f4dpf" Mar 19 11:14:00 crc kubenswrapper[4765]: I0319 11:14:00.153639 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:14:00 crc kubenswrapper[4765]: I0319 11:14:00.154407 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:14:00 crc kubenswrapper[4765]: I0319 11:14:00.155712 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:14:00 crc kubenswrapper[4765]: I0319 11:14:00.170579 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565314-f4dpf"] Mar 19 11:14:00 crc kubenswrapper[4765]: I0319 11:14:00.223745 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pkw7\" (UniqueName: \"kubernetes.io/projected/1135c796-c575-429b-a789-93d9f8d093f3-kube-api-access-4pkw7\") pod \"auto-csr-approver-29565314-f4dpf\" (UID: 
\"1135c796-c575-429b-a789-93d9f8d093f3\") " pod="openshift-infra/auto-csr-approver-29565314-f4dpf" Mar 19 11:14:00 crc kubenswrapper[4765]: I0319 11:14:00.324655 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4pkw7\" (UniqueName: \"kubernetes.io/projected/1135c796-c575-429b-a789-93d9f8d093f3-kube-api-access-4pkw7\") pod \"auto-csr-approver-29565314-f4dpf\" (UID: \"1135c796-c575-429b-a789-93d9f8d093f3\") " pod="openshift-infra/auto-csr-approver-29565314-f4dpf" Mar 19 11:14:00 crc kubenswrapper[4765]: I0319 11:14:00.352794 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pkw7\" (UniqueName: \"kubernetes.io/projected/1135c796-c575-429b-a789-93d9f8d093f3-kube-api-access-4pkw7\") pod \"auto-csr-approver-29565314-f4dpf\" (UID: \"1135c796-c575-429b-a789-93d9f8d093f3\") " pod="openshift-infra/auto-csr-approver-29565314-f4dpf" Mar 19 11:14:00 crc kubenswrapper[4765]: I0319 11:14:00.470714 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565314-f4dpf" Mar 19 11:14:00 crc kubenswrapper[4765]: I0319 11:14:00.943487 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565314-f4dpf"] Mar 19 11:14:00 crc kubenswrapper[4765]: I0319 11:14:00.946576 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 11:14:01 crc kubenswrapper[4765]: I0319 11:14:01.035497 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565314-f4dpf" event={"ID":"1135c796-c575-429b-a789-93d9f8d093f3","Type":"ContainerStarted","Data":"11b289aea3350bc4c98e595ba3c68f5b87d57f685ad38968077cbd1a90cf2ded"} Mar 19 11:14:01 crc kubenswrapper[4765]: I0319 11:14:01.656292 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:14:01 crc kubenswrapper[4765]: I0319 11:14:01.656352 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:14:01 crc kubenswrapper[4765]: I0319 11:14:01.656396 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 11:14:01 crc kubenswrapper[4765]: I0319 11:14:01.656936 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c"} 
pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 11:14:01 crc kubenswrapper[4765]: I0319 11:14:01.657037 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" gracePeriod=600 Mar 19 11:14:01 crc kubenswrapper[4765]: E0319 11:14:01.781449 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.027983 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-btz2d"] Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.031034 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.059556 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-btz2d"] Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.069440 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z64w6\" (UniqueName: \"kubernetes.io/projected/6ee05e22-7b10-46ec-8f01-027daa3f02fd-kube-api-access-z64w6\") pod \"redhat-marketplace-btz2d\" (UID: \"6ee05e22-7b10-46ec-8f01-027daa3f02fd\") " pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.069751 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee05e22-7b10-46ec-8f01-027daa3f02fd-utilities\") pod \"redhat-marketplace-btz2d\" (UID: \"6ee05e22-7b10-46ec-8f01-027daa3f02fd\") " pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.070742 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee05e22-7b10-46ec-8f01-027daa3f02fd-catalog-content\") pod \"redhat-marketplace-btz2d\" (UID: \"6ee05e22-7b10-46ec-8f01-027daa3f02fd\") " pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.073450 4765 generic.go:334] "Generic (PLEG): container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" exitCode=0 Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.073551 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" 
event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c"} Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.073659 4765 scope.go:117] "RemoveContainer" containerID="13204da80ca05cc175b015bf0b4bf2b868c7ec91fcec2cd7e027044b6cd5d8e2" Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.075948 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:14:02 crc kubenswrapper[4765]: E0319 11:14:02.077412 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.173528 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z64w6\" (UniqueName: \"kubernetes.io/projected/6ee05e22-7b10-46ec-8f01-027daa3f02fd-kube-api-access-z64w6\") pod \"redhat-marketplace-btz2d\" (UID: \"6ee05e22-7b10-46ec-8f01-027daa3f02fd\") " pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.174159 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee05e22-7b10-46ec-8f01-027daa3f02fd-utilities\") pod \"redhat-marketplace-btz2d\" (UID: \"6ee05e22-7b10-46ec-8f01-027daa3f02fd\") " pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.174450 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/6ee05e22-7b10-46ec-8f01-027daa3f02fd-catalog-content\") pod \"redhat-marketplace-btz2d\" (UID: \"6ee05e22-7b10-46ec-8f01-027daa3f02fd\") " pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.177385 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee05e22-7b10-46ec-8f01-027daa3f02fd-utilities\") pod \"redhat-marketplace-btz2d\" (UID: \"6ee05e22-7b10-46ec-8f01-027daa3f02fd\") " pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.181864 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee05e22-7b10-46ec-8f01-027daa3f02fd-catalog-content\") pod \"redhat-marketplace-btz2d\" (UID: \"6ee05e22-7b10-46ec-8f01-027daa3f02fd\") " pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.204924 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z64w6\" (UniqueName: \"kubernetes.io/projected/6ee05e22-7b10-46ec-8f01-027daa3f02fd-kube-api-access-z64w6\") pod \"redhat-marketplace-btz2d\" (UID: \"6ee05e22-7b10-46ec-8f01-027daa3f02fd\") " pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.352400 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:02 crc kubenswrapper[4765]: I0319 11:14:02.819516 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-btz2d"] Mar 19 11:14:03 crc kubenswrapper[4765]: I0319 11:14:03.081441 4765 generic.go:334] "Generic (PLEG): container finished" podID="6ee05e22-7b10-46ec-8f01-027daa3f02fd" containerID="303307f8f11fe973d163281053dff9da30a16d503569ed1d6f39e2df8f07e325" exitCode=0 Mar 19 11:14:03 crc kubenswrapper[4765]: I0319 11:14:03.081487 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btz2d" event={"ID":"6ee05e22-7b10-46ec-8f01-027daa3f02fd","Type":"ContainerDied","Data":"303307f8f11fe973d163281053dff9da30a16d503569ed1d6f39e2df8f07e325"} Mar 19 11:14:03 crc kubenswrapper[4765]: I0319 11:14:03.081759 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btz2d" event={"ID":"6ee05e22-7b10-46ec-8f01-027daa3f02fd","Type":"ContainerStarted","Data":"1161fa79df535d31ea80688d19e3847bad87c61c78c6376882195fb4788b914e"} Mar 19 11:14:05 crc kubenswrapper[4765]: I0319 11:14:05.110091 4765 generic.go:334] "Generic (PLEG): container finished" podID="6ee05e22-7b10-46ec-8f01-027daa3f02fd" containerID="1cadb3cc4bb706a02deb3d6eef289842a545b8c8574141671be8913951aa008c" exitCode=0 Mar 19 11:14:05 crc kubenswrapper[4765]: I0319 11:14:05.110348 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btz2d" event={"ID":"6ee05e22-7b10-46ec-8f01-027daa3f02fd","Type":"ContainerDied","Data":"1cadb3cc4bb706a02deb3d6eef289842a545b8c8574141671be8913951aa008c"} Mar 19 11:14:06 crc kubenswrapper[4765]: I0319 11:14:06.121583 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btz2d" 
event={"ID":"6ee05e22-7b10-46ec-8f01-027daa3f02fd","Type":"ContainerStarted","Data":"bf47375733588e15bf40a3273f6e3ebab4766e238c45b6369333ce555ad672a4"} Mar 19 11:14:06 crc kubenswrapper[4765]: I0319 11:14:06.150405 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-btz2d" podStartSLOduration=1.683358239 podStartE2EDuration="4.150384149s" podCreationTimestamp="2026-03-19 11:14:02 +0000 UTC" firstStartedPulling="2026-03-19 11:14:03.083107487 +0000 UTC m=+3141.432053029" lastFinishedPulling="2026-03-19 11:14:05.550133397 +0000 UTC m=+3143.899078939" observedRunningTime="2026-03-19 11:14:06.143200005 +0000 UTC m=+3144.492145557" watchObservedRunningTime="2026-03-19 11:14:06.150384149 +0000 UTC m=+3144.499329701" Mar 19 11:14:07 crc kubenswrapper[4765]: I0319 11:14:07.133478 4765 generic.go:334] "Generic (PLEG): container finished" podID="1135c796-c575-429b-a789-93d9f8d093f3" containerID="b5243380a142956123d3cd0bcaac9cdd8e40ce6dbb344bd110ab0e9909fd0a89" exitCode=0 Mar 19 11:14:07 crc kubenswrapper[4765]: I0319 11:14:07.133585 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565314-f4dpf" event={"ID":"1135c796-c575-429b-a789-93d9f8d093f3","Type":"ContainerDied","Data":"b5243380a142956123d3cd0bcaac9cdd8e40ce6dbb344bd110ab0e9909fd0a89"} Mar 19 11:14:08 crc kubenswrapper[4765]: I0319 11:14:08.558835 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565314-f4dpf" Mar 19 11:14:08 crc kubenswrapper[4765]: I0319 11:14:08.697201 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4pkw7\" (UniqueName: \"kubernetes.io/projected/1135c796-c575-429b-a789-93d9f8d093f3-kube-api-access-4pkw7\") pod \"1135c796-c575-429b-a789-93d9f8d093f3\" (UID: \"1135c796-c575-429b-a789-93d9f8d093f3\") " Mar 19 11:14:08 crc kubenswrapper[4765]: I0319 11:14:08.703700 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1135c796-c575-429b-a789-93d9f8d093f3-kube-api-access-4pkw7" (OuterVolumeSpecName: "kube-api-access-4pkw7") pod "1135c796-c575-429b-a789-93d9f8d093f3" (UID: "1135c796-c575-429b-a789-93d9f8d093f3"). InnerVolumeSpecName "kube-api-access-4pkw7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:14:08 crc kubenswrapper[4765]: I0319 11:14:08.803575 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4pkw7\" (UniqueName: \"kubernetes.io/projected/1135c796-c575-429b-a789-93d9f8d093f3-kube-api-access-4pkw7\") on node \"crc\" DevicePath \"\"" Mar 19 11:14:09 crc kubenswrapper[4765]: I0319 11:14:09.160992 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565314-f4dpf" event={"ID":"1135c796-c575-429b-a789-93d9f8d093f3","Type":"ContainerDied","Data":"11b289aea3350bc4c98e595ba3c68f5b87d57f685ad38968077cbd1a90cf2ded"} Mar 19 11:14:09 crc kubenswrapper[4765]: I0319 11:14:09.161039 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="11b289aea3350bc4c98e595ba3c68f5b87d57f685ad38968077cbd1a90cf2ded" Mar 19 11:14:09 crc kubenswrapper[4765]: I0319 11:14:09.161052 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565314-f4dpf" Mar 19 11:14:09 crc kubenswrapper[4765]: I0319 11:14:09.629206 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565308-dzlmc"] Mar 19 11:14:09 crc kubenswrapper[4765]: I0319 11:14:09.640391 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565308-dzlmc"] Mar 19 11:14:10 crc kubenswrapper[4765]: I0319 11:14:10.368708 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9" path="/var/lib/kubelet/pods/f327f8ad-f6c5-41cc-9e21-95b3c93ee9f9/volumes" Mar 19 11:14:12 crc kubenswrapper[4765]: I0319 11:14:12.352534 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:12 crc kubenswrapper[4765]: I0319 11:14:12.352806 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:12 crc kubenswrapper[4765]: I0319 11:14:12.405619 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:13 crc kubenswrapper[4765]: I0319 11:14:13.238723 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:13 crc kubenswrapper[4765]: I0319 11:14:13.281747 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-btz2d"] Mar 19 11:14:14 crc kubenswrapper[4765]: I0319 11:14:14.356689 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:14:14 crc kubenswrapper[4765]: E0319 11:14:14.357007 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: 
\"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:14:15 crc kubenswrapper[4765]: I0319 11:14:15.214558 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-btz2d" podUID="6ee05e22-7b10-46ec-8f01-027daa3f02fd" containerName="registry-server" containerID="cri-o://bf47375733588e15bf40a3273f6e3ebab4766e238c45b6369333ce555ad672a4" gracePeriod=2 Mar 19 11:14:15 crc kubenswrapper[4765]: I0319 11:14:15.749676 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:15 crc kubenswrapper[4765]: I0319 11:14:15.833118 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z64w6\" (UniqueName: \"kubernetes.io/projected/6ee05e22-7b10-46ec-8f01-027daa3f02fd-kube-api-access-z64w6\") pod \"6ee05e22-7b10-46ec-8f01-027daa3f02fd\" (UID: \"6ee05e22-7b10-46ec-8f01-027daa3f02fd\") " Mar 19 11:14:15 crc kubenswrapper[4765]: I0319 11:14:15.833547 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee05e22-7b10-46ec-8f01-027daa3f02fd-catalog-content\") pod \"6ee05e22-7b10-46ec-8f01-027daa3f02fd\" (UID: \"6ee05e22-7b10-46ec-8f01-027daa3f02fd\") " Mar 19 11:14:15 crc kubenswrapper[4765]: I0319 11:14:15.833685 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee05e22-7b10-46ec-8f01-027daa3f02fd-utilities\") pod \"6ee05e22-7b10-46ec-8f01-027daa3f02fd\" (UID: \"6ee05e22-7b10-46ec-8f01-027daa3f02fd\") " Mar 19 11:14:15 crc kubenswrapper[4765]: I0319 11:14:15.835093 4765 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee05e22-7b10-46ec-8f01-027daa3f02fd-utilities" (OuterVolumeSpecName: "utilities") pod "6ee05e22-7b10-46ec-8f01-027daa3f02fd" (UID: "6ee05e22-7b10-46ec-8f01-027daa3f02fd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:14:15 crc kubenswrapper[4765]: I0319 11:14:15.845771 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ee05e22-7b10-46ec-8f01-027daa3f02fd-kube-api-access-z64w6" (OuterVolumeSpecName: "kube-api-access-z64w6") pod "6ee05e22-7b10-46ec-8f01-027daa3f02fd" (UID: "6ee05e22-7b10-46ec-8f01-027daa3f02fd"). InnerVolumeSpecName "kube-api-access-z64w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:14:15 crc kubenswrapper[4765]: I0319 11:14:15.859377 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6ee05e22-7b10-46ec-8f01-027daa3f02fd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6ee05e22-7b10-46ec-8f01-027daa3f02fd" (UID: "6ee05e22-7b10-46ec-8f01-027daa3f02fd"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:14:15 crc kubenswrapper[4765]: I0319 11:14:15.936098 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6ee05e22-7b10-46ec-8f01-027daa3f02fd-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 11:14:15 crc kubenswrapper[4765]: I0319 11:14:15.936381 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6ee05e22-7b10-46ec-8f01-027daa3f02fd-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 11:14:15 crc kubenswrapper[4765]: I0319 11:14:15.936478 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z64w6\" (UniqueName: \"kubernetes.io/projected/6ee05e22-7b10-46ec-8f01-027daa3f02fd-kube-api-access-z64w6\") on node \"crc\" DevicePath \"\"" Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.223580 4765 generic.go:334] "Generic (PLEG): container finished" podID="6ee05e22-7b10-46ec-8f01-027daa3f02fd" containerID="bf47375733588e15bf40a3273f6e3ebab4766e238c45b6369333ce555ad672a4" exitCode=0 Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.223644 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-btz2d" Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.223641 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btz2d" event={"ID":"6ee05e22-7b10-46ec-8f01-027daa3f02fd","Type":"ContainerDied","Data":"bf47375733588e15bf40a3273f6e3ebab4766e238c45b6369333ce555ad672a4"} Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.223705 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-btz2d" event={"ID":"6ee05e22-7b10-46ec-8f01-027daa3f02fd","Type":"ContainerDied","Data":"1161fa79df535d31ea80688d19e3847bad87c61c78c6376882195fb4788b914e"} Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.223732 4765 scope.go:117] "RemoveContainer" containerID="bf47375733588e15bf40a3273f6e3ebab4766e238c45b6369333ce555ad672a4" Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.243390 4765 scope.go:117] "RemoveContainer" containerID="1cadb3cc4bb706a02deb3d6eef289842a545b8c8574141671be8913951aa008c" Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.261381 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-btz2d"] Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.274382 4765 scope.go:117] "RemoveContainer" containerID="303307f8f11fe973d163281053dff9da30a16d503569ed1d6f39e2df8f07e325" Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.275292 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-btz2d"] Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.309619 4765 scope.go:117] "RemoveContainer" containerID="bf47375733588e15bf40a3273f6e3ebab4766e238c45b6369333ce555ad672a4" Mar 19 11:14:16 crc kubenswrapper[4765]: E0319 11:14:16.310146 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"bf47375733588e15bf40a3273f6e3ebab4766e238c45b6369333ce555ad672a4\": container with ID starting with bf47375733588e15bf40a3273f6e3ebab4766e238c45b6369333ce555ad672a4 not found: ID does not exist" containerID="bf47375733588e15bf40a3273f6e3ebab4766e238c45b6369333ce555ad672a4" Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.310189 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bf47375733588e15bf40a3273f6e3ebab4766e238c45b6369333ce555ad672a4"} err="failed to get container status \"bf47375733588e15bf40a3273f6e3ebab4766e238c45b6369333ce555ad672a4\": rpc error: code = NotFound desc = could not find container \"bf47375733588e15bf40a3273f6e3ebab4766e238c45b6369333ce555ad672a4\": container with ID starting with bf47375733588e15bf40a3273f6e3ebab4766e238c45b6369333ce555ad672a4 not found: ID does not exist" Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.310217 4765 scope.go:117] "RemoveContainer" containerID="1cadb3cc4bb706a02deb3d6eef289842a545b8c8574141671be8913951aa008c" Mar 19 11:14:16 crc kubenswrapper[4765]: E0319 11:14:16.310512 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1cadb3cc4bb706a02deb3d6eef289842a545b8c8574141671be8913951aa008c\": container with ID starting with 1cadb3cc4bb706a02deb3d6eef289842a545b8c8574141671be8913951aa008c not found: ID does not exist" containerID="1cadb3cc4bb706a02deb3d6eef289842a545b8c8574141671be8913951aa008c" Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.310545 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cadb3cc4bb706a02deb3d6eef289842a545b8c8574141671be8913951aa008c"} err="failed to get container status \"1cadb3cc4bb706a02deb3d6eef289842a545b8c8574141671be8913951aa008c\": rpc error: code = NotFound desc = could not find container \"1cadb3cc4bb706a02deb3d6eef289842a545b8c8574141671be8913951aa008c\": container with ID 
starting with 1cadb3cc4bb706a02deb3d6eef289842a545b8c8574141671be8913951aa008c not found: ID does not exist" Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.310567 4765 scope.go:117] "RemoveContainer" containerID="303307f8f11fe973d163281053dff9da30a16d503569ed1d6f39e2df8f07e325" Mar 19 11:14:16 crc kubenswrapper[4765]: E0319 11:14:16.310873 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"303307f8f11fe973d163281053dff9da30a16d503569ed1d6f39e2df8f07e325\": container with ID starting with 303307f8f11fe973d163281053dff9da30a16d503569ed1d6f39e2df8f07e325 not found: ID does not exist" containerID="303307f8f11fe973d163281053dff9da30a16d503569ed1d6f39e2df8f07e325" Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.310892 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"303307f8f11fe973d163281053dff9da30a16d503569ed1d6f39e2df8f07e325"} err="failed to get container status \"303307f8f11fe973d163281053dff9da30a16d503569ed1d6f39e2df8f07e325\": rpc error: code = NotFound desc = could not find container \"303307f8f11fe973d163281053dff9da30a16d503569ed1d6f39e2df8f07e325\": container with ID starting with 303307f8f11fe973d163281053dff9da30a16d503569ed1d6f39e2df8f07e325 not found: ID does not exist" Mar 19 11:14:16 crc kubenswrapper[4765]: I0319 11:14:16.367307 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ee05e22-7b10-46ec-8f01-027daa3f02fd" path="/var/lib/kubelet/pods/6ee05e22-7b10-46ec-8f01-027daa3f02fd/volumes" Mar 19 11:14:26 crc kubenswrapper[4765]: I0319 11:14:26.356415 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:14:26 crc kubenswrapper[4765]: E0319 11:14:26.357236 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:14:39 crc kubenswrapper[4765]: I0319 11:14:39.356469 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:14:39 crc kubenswrapper[4765]: E0319 11:14:39.357160 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:14:52 crc kubenswrapper[4765]: I0319 11:14:52.370817 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:14:52 crc kubenswrapper[4765]: E0319 11:14:52.373253 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.142581 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr"] Mar 19 11:15:00 crc kubenswrapper[4765]: E0319 11:15:00.143595 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee05e22-7b10-46ec-8f01-027daa3f02fd" 
containerName="registry-server" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.143612 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee05e22-7b10-46ec-8f01-027daa3f02fd" containerName="registry-server" Mar 19 11:15:00 crc kubenswrapper[4765]: E0319 11:15:00.143644 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1135c796-c575-429b-a789-93d9f8d093f3" containerName="oc" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.143658 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="1135c796-c575-429b-a789-93d9f8d093f3" containerName="oc" Mar 19 11:15:00 crc kubenswrapper[4765]: E0319 11:15:00.143670 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee05e22-7b10-46ec-8f01-027daa3f02fd" containerName="extract-content" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.143680 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee05e22-7b10-46ec-8f01-027daa3f02fd" containerName="extract-content" Mar 19 11:15:00 crc kubenswrapper[4765]: E0319 11:15:00.143701 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ee05e22-7b10-46ec-8f01-027daa3f02fd" containerName="extract-utilities" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.143709 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ee05e22-7b10-46ec-8f01-027daa3f02fd" containerName="extract-utilities" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.143988 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ee05e22-7b10-46ec-8f01-027daa3f02fd" containerName="registry-server" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.144004 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="1135c796-c575-429b-a789-93d9f8d093f3" containerName="oc" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.144720 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.150060 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.150060 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.156554 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr"] Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.216011 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9qjtr\" (UniqueName: \"kubernetes.io/projected/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-kube-api-access-9qjtr\") pod \"collect-profiles-29565315-9xrwr\" (UID: \"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.216216 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-config-volume\") pod \"collect-profiles-29565315-9xrwr\" (UID: \"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.216258 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-secret-volume\") pod \"collect-profiles-29565315-9xrwr\" (UID: \"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.318276 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-config-volume\") pod \"collect-profiles-29565315-9xrwr\" (UID: \"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.318322 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-secret-volume\") pod \"collect-profiles-29565315-9xrwr\" (UID: \"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.318422 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9qjtr\" (UniqueName: \"kubernetes.io/projected/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-kube-api-access-9qjtr\") pod \"collect-profiles-29565315-9xrwr\" (UID: \"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.319341 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-config-volume\") pod \"collect-profiles-29565315-9xrwr\" (UID: \"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.323714 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-secret-volume\") pod \"collect-profiles-29565315-9xrwr\" (UID: \"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.336732 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9qjtr\" (UniqueName: \"kubernetes.io/projected/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-kube-api-access-9qjtr\") pod \"collect-profiles-29565315-9xrwr\" (UID: \"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.463937 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" Mar 19 11:15:00 crc kubenswrapper[4765]: I0319 11:15:00.966608 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr"] Mar 19 11:15:01 crc kubenswrapper[4765]: I0319 11:15:01.662352 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" event={"ID":"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9","Type":"ContainerStarted","Data":"04e2598dba0e971414a340227c2ffa16dd9cdeb65edc5a9e3690d9887cf5d2bf"} Mar 19 11:15:01 crc kubenswrapper[4765]: I0319 11:15:01.663913 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" event={"ID":"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9","Type":"ContainerStarted","Data":"51f02df8bece641d4437cac7d40a2f348f8298931504fb59f4621220f2ed41f5"} Mar 19 11:15:01 crc kubenswrapper[4765]: I0319 11:15:01.682094 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" 
podStartSLOduration=1.6820686729999998 podStartE2EDuration="1.682068673s" podCreationTimestamp="2026-03-19 11:15:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:15:01.677249893 +0000 UTC m=+3200.026195445" watchObservedRunningTime="2026-03-19 11:15:01.682068673 +0000 UTC m=+3200.031014215" Mar 19 11:15:02 crc kubenswrapper[4765]: I0319 11:15:02.672255 4765 generic.go:334] "Generic (PLEG): container finished" podID="5f5c23bc-84a5-472a-a9b8-7e62791ad5e9" containerID="04e2598dba0e971414a340227c2ffa16dd9cdeb65edc5a9e3690d9887cf5d2bf" exitCode=0 Mar 19 11:15:02 crc kubenswrapper[4765]: I0319 11:15:02.672308 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" event={"ID":"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9","Type":"ContainerDied","Data":"04e2598dba0e971414a340227c2ffa16dd9cdeb65edc5a9e3690d9887cf5d2bf"} Mar 19 11:15:04 crc kubenswrapper[4765]: I0319 11:15:04.085370 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" Mar 19 11:15:04 crc kubenswrapper[4765]: I0319 11:15:04.193547 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-secret-volume\") pod \"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9\" (UID: \"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9\") " Mar 19 11:15:04 crc kubenswrapper[4765]: I0319 11:15:04.193916 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-config-volume\") pod \"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9\" (UID: \"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9\") " Mar 19 11:15:04 crc kubenswrapper[4765]: I0319 11:15:04.194118 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9qjtr\" (UniqueName: \"kubernetes.io/projected/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-kube-api-access-9qjtr\") pod \"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9\" (UID: \"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9\") " Mar 19 11:15:04 crc kubenswrapper[4765]: I0319 11:15:04.194435 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-config-volume" (OuterVolumeSpecName: "config-volume") pod "5f5c23bc-84a5-472a-a9b8-7e62791ad5e9" (UID: "5f5c23bc-84a5-472a-a9b8-7e62791ad5e9"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:15:04 crc kubenswrapper[4765]: I0319 11:15:04.195118 4765 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 11:15:04 crc kubenswrapper[4765]: I0319 11:15:04.199921 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-kube-api-access-9qjtr" (OuterVolumeSpecName: "kube-api-access-9qjtr") pod "5f5c23bc-84a5-472a-a9b8-7e62791ad5e9" (UID: "5f5c23bc-84a5-472a-a9b8-7e62791ad5e9"). InnerVolumeSpecName "kube-api-access-9qjtr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:15:04 crc kubenswrapper[4765]: I0319 11:15:04.200566 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "5f5c23bc-84a5-472a-a9b8-7e62791ad5e9" (UID: "5f5c23bc-84a5-472a-a9b8-7e62791ad5e9"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:15:04 crc kubenswrapper[4765]: I0319 11:15:04.296803 4765 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 11:15:04 crc kubenswrapper[4765]: I0319 11:15:04.296839 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9qjtr\" (UniqueName: \"kubernetes.io/projected/5f5c23bc-84a5-472a-a9b8-7e62791ad5e9-kube-api-access-9qjtr\") on node \"crc\" DevicePath \"\"" Mar 19 11:15:04 crc kubenswrapper[4765]: I0319 11:15:04.690147 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" event={"ID":"5f5c23bc-84a5-472a-a9b8-7e62791ad5e9","Type":"ContainerDied","Data":"51f02df8bece641d4437cac7d40a2f348f8298931504fb59f4621220f2ed41f5"} Mar 19 11:15:04 crc kubenswrapper[4765]: I0319 11:15:04.690408 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="51f02df8bece641d4437cac7d40a2f348f8298931504fb59f4621220f2ed41f5" Mar 19 11:15:04 crc kubenswrapper[4765]: I0319 11:15:04.690236 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565315-9xrwr" Mar 19 11:15:04 crc kubenswrapper[4765]: I0319 11:15:04.806054 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796"] Mar 19 11:15:04 crc kubenswrapper[4765]: I0319 11:15:04.820301 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565270-9c796"] Mar 19 11:15:06 crc kubenswrapper[4765]: I0319 11:15:06.356098 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:15:06 crc kubenswrapper[4765]: E0319 11:15:06.357926 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:15:06 crc kubenswrapper[4765]: I0319 11:15:06.375190 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b8d3c9f-2553-44cf-971d-27dec0e5f66e" path="/var/lib/kubelet/pods/6b8d3c9f-2553-44cf-971d-27dec0e5f66e/volumes" Mar 19 11:15:15 crc kubenswrapper[4765]: I0319 11:15:15.502499 4765 scope.go:117] "RemoveContainer" containerID="6eec538347335f65fe4b39f439f3ac17e9baf92ed165e146936650fbae782220" Mar 19 11:15:15 crc kubenswrapper[4765]: I0319 11:15:15.548163 4765 scope.go:117] "RemoveContainer" containerID="49e18886eb0d4d2303675edd42e0a64d9cd5e3bee8e0e35fef0dda43405fcb89" Mar 19 11:15:18 crc kubenswrapper[4765]: I0319 11:15:18.356476 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:15:18 crc kubenswrapper[4765]: E0319 
11:15:18.357874 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:15:32 crc kubenswrapper[4765]: I0319 11:15:32.363240 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:15:32 crc kubenswrapper[4765]: E0319 11:15:32.364174 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:15:47 crc kubenswrapper[4765]: I0319 11:15:47.356643 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:15:47 crc kubenswrapper[4765]: E0319 11:15:47.358175 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:16:00 crc kubenswrapper[4765]: I0319 11:16:00.162582 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565316-nw4lf"] Mar 19 11:16:00 crc 
kubenswrapper[4765]: E0319 11:16:00.163808 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f5c23bc-84a5-472a-a9b8-7e62791ad5e9" containerName="collect-profiles" Mar 19 11:16:00 crc kubenswrapper[4765]: I0319 11:16:00.163829 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f5c23bc-84a5-472a-a9b8-7e62791ad5e9" containerName="collect-profiles" Mar 19 11:16:00 crc kubenswrapper[4765]: I0319 11:16:00.164186 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f5c23bc-84a5-472a-a9b8-7e62791ad5e9" containerName="collect-profiles" Mar 19 11:16:00 crc kubenswrapper[4765]: I0319 11:16:00.165188 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565316-nw4lf" Mar 19 11:16:00 crc kubenswrapper[4765]: I0319 11:16:00.171311 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:16:00 crc kubenswrapper[4765]: I0319 11:16:00.171736 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:16:00 crc kubenswrapper[4765]: I0319 11:16:00.173132 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:16:00 crc kubenswrapper[4765]: I0319 11:16:00.176670 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565316-nw4lf"] Mar 19 11:16:00 crc kubenswrapper[4765]: I0319 11:16:00.256418 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g7wlp\" (UniqueName: \"kubernetes.io/projected/96556ca1-43dd-4913-b94c-48ef7d7d00b0-kube-api-access-g7wlp\") pod \"auto-csr-approver-29565316-nw4lf\" (UID: \"96556ca1-43dd-4913-b94c-48ef7d7d00b0\") " pod="openshift-infra/auto-csr-approver-29565316-nw4lf" Mar 19 11:16:00 crc kubenswrapper[4765]: I0319 11:16:00.358376 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g7wlp\" (UniqueName: \"kubernetes.io/projected/96556ca1-43dd-4913-b94c-48ef7d7d00b0-kube-api-access-g7wlp\") pod \"auto-csr-approver-29565316-nw4lf\" (UID: \"96556ca1-43dd-4913-b94c-48ef7d7d00b0\") " pod="openshift-infra/auto-csr-approver-29565316-nw4lf" Mar 19 11:16:00 crc kubenswrapper[4765]: I0319 11:16:00.381239 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g7wlp\" (UniqueName: \"kubernetes.io/projected/96556ca1-43dd-4913-b94c-48ef7d7d00b0-kube-api-access-g7wlp\") pod \"auto-csr-approver-29565316-nw4lf\" (UID: \"96556ca1-43dd-4913-b94c-48ef7d7d00b0\") " pod="openshift-infra/auto-csr-approver-29565316-nw4lf" Mar 19 11:16:00 crc kubenswrapper[4765]: I0319 11:16:00.510152 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565316-nw4lf" Mar 19 11:16:01 crc kubenswrapper[4765]: I0319 11:16:01.001404 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565316-nw4lf"] Mar 19 11:16:01 crc kubenswrapper[4765]: I0319 11:16:01.225881 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565316-nw4lf" event={"ID":"96556ca1-43dd-4913-b94c-48ef7d7d00b0","Type":"ContainerStarted","Data":"cb24a689727913026dc101d2e33f1ff38807bc03e9b00b62aaf7034b467ee7f0"} Mar 19 11:16:01 crc kubenswrapper[4765]: I0319 11:16:01.357171 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:16:01 crc kubenswrapper[4765]: E0319 11:16:01.357599 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:16:03 crc kubenswrapper[4765]: I0319 11:16:03.254327 4765 generic.go:334] "Generic (PLEG): container finished" podID="96556ca1-43dd-4913-b94c-48ef7d7d00b0" containerID="310dd71db96197a0ac359aa87454e087fd00caa995716204cd037a442e9fa1a5" exitCode=0 Mar 19 11:16:03 crc kubenswrapper[4765]: I0319 11:16:03.254387 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565316-nw4lf" event={"ID":"96556ca1-43dd-4913-b94c-48ef7d7d00b0","Type":"ContainerDied","Data":"310dd71db96197a0ac359aa87454e087fd00caa995716204cd037a442e9fa1a5"} Mar 19 11:16:04 crc kubenswrapper[4765]: I0319 11:16:04.711467 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565316-nw4lf" Mar 19 11:16:04 crc kubenswrapper[4765]: I0319 11:16:04.745092 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g7wlp\" (UniqueName: \"kubernetes.io/projected/96556ca1-43dd-4913-b94c-48ef7d7d00b0-kube-api-access-g7wlp\") pod \"96556ca1-43dd-4913-b94c-48ef7d7d00b0\" (UID: \"96556ca1-43dd-4913-b94c-48ef7d7d00b0\") " Mar 19 11:16:04 crc kubenswrapper[4765]: I0319 11:16:04.755098 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96556ca1-43dd-4913-b94c-48ef7d7d00b0-kube-api-access-g7wlp" (OuterVolumeSpecName: "kube-api-access-g7wlp") pod "96556ca1-43dd-4913-b94c-48ef7d7d00b0" (UID: "96556ca1-43dd-4913-b94c-48ef7d7d00b0"). InnerVolumeSpecName "kube-api-access-g7wlp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:16:04 crc kubenswrapper[4765]: I0319 11:16:04.848315 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g7wlp\" (UniqueName: \"kubernetes.io/projected/96556ca1-43dd-4913-b94c-48ef7d7d00b0-kube-api-access-g7wlp\") on node \"crc\" DevicePath \"\"" Mar 19 11:16:05 crc kubenswrapper[4765]: I0319 11:16:05.286128 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565316-nw4lf" event={"ID":"96556ca1-43dd-4913-b94c-48ef7d7d00b0","Type":"ContainerDied","Data":"cb24a689727913026dc101d2e33f1ff38807bc03e9b00b62aaf7034b467ee7f0"} Mar 19 11:16:05 crc kubenswrapper[4765]: I0319 11:16:05.286168 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="cb24a689727913026dc101d2e33f1ff38807bc03e9b00b62aaf7034b467ee7f0" Mar 19 11:16:05 crc kubenswrapper[4765]: I0319 11:16:05.286217 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565316-nw4lf" Mar 19 11:16:05 crc kubenswrapper[4765]: I0319 11:16:05.784797 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565310-wd645"] Mar 19 11:16:05 crc kubenswrapper[4765]: I0319 11:16:05.795225 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565310-wd645"] Mar 19 11:16:06 crc kubenswrapper[4765]: I0319 11:16:06.367761 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80970b68-1570-4059-a9ff-2f5a09746ffa" path="/var/lib/kubelet/pods/80970b68-1570-4059-a9ff-2f5a09746ffa/volumes" Mar 19 11:16:13 crc kubenswrapper[4765]: I0319 11:16:13.356809 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:16:13 crc kubenswrapper[4765]: E0319 11:16:13.357644 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:16:15 crc kubenswrapper[4765]: I0319 11:16:15.634134 4765 scope.go:117] "RemoveContainer" containerID="8d24b6e7edb4070fed2fedb70413ced82873f98d7856a21510d00e44e9d42db6" Mar 19 11:16:28 crc kubenswrapper[4765]: I0319 11:16:28.358378 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:16:28 crc kubenswrapper[4765]: E0319 11:16:28.359223 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:16:40 crc kubenswrapper[4765]: I0319 11:16:40.356568 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:16:40 crc kubenswrapper[4765]: E0319 11:16:40.357377 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:16:52 crc kubenswrapper[4765]: I0319 11:16:52.362501 4765 scope.go:117] "RemoveContainer" 
containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:16:52 crc kubenswrapper[4765]: E0319 11:16:52.363752 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:17:04 crc kubenswrapper[4765]: I0319 11:17:04.918898 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-7cg5x"] Mar 19 11:17:04 crc kubenswrapper[4765]: E0319 11:17:04.919946 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96556ca1-43dd-4913-b94c-48ef7d7d00b0" containerName="oc" Mar 19 11:17:04 crc kubenswrapper[4765]: I0319 11:17:04.919984 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="96556ca1-43dd-4913-b94c-48ef7d7d00b0" containerName="oc" Mar 19 11:17:04 crc kubenswrapper[4765]: I0319 11:17:04.920250 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="96556ca1-43dd-4913-b94c-48ef7d7d00b0" containerName="oc" Mar 19 11:17:04 crc kubenswrapper[4765]: I0319 11:17:04.922194 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:04 crc kubenswrapper[4765]: I0319 11:17:04.945521 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7cg5x"] Mar 19 11:17:05 crc kubenswrapper[4765]: I0319 11:17:05.041813 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09270b5-4e86-4bc1-bd38-b5d72417e555-catalog-content\") pod \"certified-operators-7cg5x\" (UID: \"a09270b5-4e86-4bc1-bd38-b5d72417e555\") " pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:05 crc kubenswrapper[4765]: I0319 11:17:05.041924 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bw5hv\" (UniqueName: \"kubernetes.io/projected/a09270b5-4e86-4bc1-bd38-b5d72417e555-kube-api-access-bw5hv\") pod \"certified-operators-7cg5x\" (UID: \"a09270b5-4e86-4bc1-bd38-b5d72417e555\") " pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:05 crc kubenswrapper[4765]: I0319 11:17:05.042065 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09270b5-4e86-4bc1-bd38-b5d72417e555-utilities\") pod \"certified-operators-7cg5x\" (UID: \"a09270b5-4e86-4bc1-bd38-b5d72417e555\") " pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:05 crc kubenswrapper[4765]: I0319 11:17:05.143989 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09270b5-4e86-4bc1-bd38-b5d72417e555-catalog-content\") pod \"certified-operators-7cg5x\" (UID: \"a09270b5-4e86-4bc1-bd38-b5d72417e555\") " pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:05 crc kubenswrapper[4765]: I0319 11:17:05.144077 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-bw5hv\" (UniqueName: \"kubernetes.io/projected/a09270b5-4e86-4bc1-bd38-b5d72417e555-kube-api-access-bw5hv\") pod \"certified-operators-7cg5x\" (UID: \"a09270b5-4e86-4bc1-bd38-b5d72417e555\") " pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:05 crc kubenswrapper[4765]: I0319 11:17:05.144119 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09270b5-4e86-4bc1-bd38-b5d72417e555-utilities\") pod \"certified-operators-7cg5x\" (UID: \"a09270b5-4e86-4bc1-bd38-b5d72417e555\") " pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:05 crc kubenswrapper[4765]: I0319 11:17:05.144704 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09270b5-4e86-4bc1-bd38-b5d72417e555-catalog-content\") pod \"certified-operators-7cg5x\" (UID: \"a09270b5-4e86-4bc1-bd38-b5d72417e555\") " pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:05 crc kubenswrapper[4765]: I0319 11:17:05.144774 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09270b5-4e86-4bc1-bd38-b5d72417e555-utilities\") pod \"certified-operators-7cg5x\" (UID: \"a09270b5-4e86-4bc1-bd38-b5d72417e555\") " pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:05 crc kubenswrapper[4765]: I0319 11:17:05.167532 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bw5hv\" (UniqueName: \"kubernetes.io/projected/a09270b5-4e86-4bc1-bd38-b5d72417e555-kube-api-access-bw5hv\") pod \"certified-operators-7cg5x\" (UID: \"a09270b5-4e86-4bc1-bd38-b5d72417e555\") " pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:05 crc kubenswrapper[4765]: I0319 11:17:05.246970 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:05 crc kubenswrapper[4765]: I0319 11:17:05.356872 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:17:05 crc kubenswrapper[4765]: E0319 11:17:05.357336 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:17:05 crc kubenswrapper[4765]: I0319 11:17:05.710841 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-7cg5x"] Mar 19 11:17:05 crc kubenswrapper[4765]: W0319 11:17:05.712727 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda09270b5_4e86_4bc1_bd38_b5d72417e555.slice/crio-60e617e22426d08953a2b2d8a345a5079b8ab4c018b914f05f3ee0e22b41b8e6 WatchSource:0}: Error finding container 60e617e22426d08953a2b2d8a345a5079b8ab4c018b914f05f3ee0e22b41b8e6: Status 404 returned error can't find the container with id 60e617e22426d08953a2b2d8a345a5079b8ab4c018b914f05f3ee0e22b41b8e6 Mar 19 11:17:05 crc kubenswrapper[4765]: I0319 11:17:05.800157 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7cg5x" event={"ID":"a09270b5-4e86-4bc1-bd38-b5d72417e555","Type":"ContainerStarted","Data":"60e617e22426d08953a2b2d8a345a5079b8ab4c018b914f05f3ee0e22b41b8e6"} Mar 19 11:17:06 crc kubenswrapper[4765]: I0319 11:17:06.808373 4765 generic.go:334] "Generic (PLEG): container finished" podID="a09270b5-4e86-4bc1-bd38-b5d72417e555" 
containerID="53e72870ea967452de246c8ce49d2fe8ae946e2bdce02cc41f2958cc0adc7293" exitCode=0 Mar 19 11:17:06 crc kubenswrapper[4765]: I0319 11:17:06.808485 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7cg5x" event={"ID":"a09270b5-4e86-4bc1-bd38-b5d72417e555","Type":"ContainerDied","Data":"53e72870ea967452de246c8ce49d2fe8ae946e2bdce02cc41f2958cc0adc7293"} Mar 19 11:17:07 crc kubenswrapper[4765]: I0319 11:17:07.828033 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7cg5x" event={"ID":"a09270b5-4e86-4bc1-bd38-b5d72417e555","Type":"ContainerStarted","Data":"b9074dffc3b1236de629aaf6b23cb8bac66c5a6833dba52aa4fd734b4878dba4"} Mar 19 11:17:10 crc kubenswrapper[4765]: I0319 11:17:10.852807 4765 generic.go:334] "Generic (PLEG): container finished" podID="a09270b5-4e86-4bc1-bd38-b5d72417e555" containerID="b9074dffc3b1236de629aaf6b23cb8bac66c5a6833dba52aa4fd734b4878dba4" exitCode=0 Mar 19 11:17:10 crc kubenswrapper[4765]: I0319 11:17:10.852890 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7cg5x" event={"ID":"a09270b5-4e86-4bc1-bd38-b5d72417e555","Type":"ContainerDied","Data":"b9074dffc3b1236de629aaf6b23cb8bac66c5a6833dba52aa4fd734b4878dba4"} Mar 19 11:17:11 crc kubenswrapper[4765]: I0319 11:17:11.865563 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7cg5x" event={"ID":"a09270b5-4e86-4bc1-bd38-b5d72417e555","Type":"ContainerStarted","Data":"f13db9347125e4e9de80900bef1b3cab46d335f72c5d7b5fe1ac4b0ddd93559c"} Mar 19 11:17:11 crc kubenswrapper[4765]: I0319 11:17:11.890619 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-7cg5x" podStartSLOduration=3.302083966 podStartE2EDuration="7.890596634s" podCreationTimestamp="2026-03-19 11:17:04 +0000 UTC" firstStartedPulling="2026-03-19 11:17:06.810521439 
+0000 UTC m=+3325.159466981" lastFinishedPulling="2026-03-19 11:17:11.399034107 +0000 UTC m=+3329.747979649" observedRunningTime="2026-03-19 11:17:11.889167936 +0000 UTC m=+3330.238113498" watchObservedRunningTime="2026-03-19 11:17:11.890596634 +0000 UTC m=+3330.239542176" Mar 19 11:17:15 crc kubenswrapper[4765]: I0319 11:17:15.248747 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:15 crc kubenswrapper[4765]: I0319 11:17:15.249335 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:16 crc kubenswrapper[4765]: I0319 11:17:16.295264 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-7cg5x" podUID="a09270b5-4e86-4bc1-bd38-b5d72417e555" containerName="registry-server" probeResult="failure" output=< Mar 19 11:17:16 crc kubenswrapper[4765]: timeout: failed to connect service ":50051" within 1s Mar 19 11:17:16 crc kubenswrapper[4765]: > Mar 19 11:17:20 crc kubenswrapper[4765]: I0319 11:17:20.356896 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:17:20 crc kubenswrapper[4765]: E0319 11:17:20.359037 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:17:21 crc kubenswrapper[4765]: I0319 11:17:21.309938 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-wn8c4"] Mar 19 11:17:21 crc kubenswrapper[4765]: I0319 11:17:21.312550 4765 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:21 crc kubenswrapper[4765]: I0319 11:17:21.323783 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wn8c4"] Mar 19 11:17:21 crc kubenswrapper[4765]: I0319 11:17:21.451043 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qd6q7\" (UniqueName: \"kubernetes.io/projected/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-kube-api-access-qd6q7\") pod \"community-operators-wn8c4\" (UID: \"e4bcb6d5-063a-4117-a6d5-5166c5c70b50\") " pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:21 crc kubenswrapper[4765]: I0319 11:17:21.451120 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-catalog-content\") pod \"community-operators-wn8c4\" (UID: \"e4bcb6d5-063a-4117-a6d5-5166c5c70b50\") " pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:21 crc kubenswrapper[4765]: I0319 11:17:21.451155 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-utilities\") pod \"community-operators-wn8c4\" (UID: \"e4bcb6d5-063a-4117-a6d5-5166c5c70b50\") " pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:21 crc kubenswrapper[4765]: I0319 11:17:21.553062 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-utilities\") pod \"community-operators-wn8c4\" (UID: \"e4bcb6d5-063a-4117-a6d5-5166c5c70b50\") " pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:21 crc kubenswrapper[4765]: I0319 11:17:21.553298 4765 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qd6q7\" (UniqueName: \"kubernetes.io/projected/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-kube-api-access-qd6q7\") pod \"community-operators-wn8c4\" (UID: \"e4bcb6d5-063a-4117-a6d5-5166c5c70b50\") " pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:21 crc kubenswrapper[4765]: I0319 11:17:21.553346 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-catalog-content\") pod \"community-operators-wn8c4\" (UID: \"e4bcb6d5-063a-4117-a6d5-5166c5c70b50\") " pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:21 crc kubenswrapper[4765]: I0319 11:17:21.553674 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-utilities\") pod \"community-operators-wn8c4\" (UID: \"e4bcb6d5-063a-4117-a6d5-5166c5c70b50\") " pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:21 crc kubenswrapper[4765]: I0319 11:17:21.553749 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-catalog-content\") pod \"community-operators-wn8c4\" (UID: \"e4bcb6d5-063a-4117-a6d5-5166c5c70b50\") " pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:21 crc kubenswrapper[4765]: I0319 11:17:21.572498 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qd6q7\" (UniqueName: \"kubernetes.io/projected/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-kube-api-access-qd6q7\") pod \"community-operators-wn8c4\" (UID: \"e4bcb6d5-063a-4117-a6d5-5166c5c70b50\") " pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:21 crc kubenswrapper[4765]: I0319 11:17:21.632195 4765 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:22 crc kubenswrapper[4765]: I0319 11:17:22.256000 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-wn8c4"] Mar 19 11:17:22 crc kubenswrapper[4765]: I0319 11:17:22.975535 4765 generic.go:334] "Generic (PLEG): container finished" podID="e4bcb6d5-063a-4117-a6d5-5166c5c70b50" containerID="5514eda27ff576a7528f831734f7d142390e544b0a952424d7f8c154f9ccd572" exitCode=0 Mar 19 11:17:22 crc kubenswrapper[4765]: I0319 11:17:22.975654 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wn8c4" event={"ID":"e4bcb6d5-063a-4117-a6d5-5166c5c70b50","Type":"ContainerDied","Data":"5514eda27ff576a7528f831734f7d142390e544b0a952424d7f8c154f9ccd572"} Mar 19 11:17:22 crc kubenswrapper[4765]: I0319 11:17:22.975945 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wn8c4" event={"ID":"e4bcb6d5-063a-4117-a6d5-5166c5c70b50","Type":"ContainerStarted","Data":"cad07ab4f3d6f7a1a5149781a3eadf447cc98e1d0894d49580ffee92d94cb823"} Mar 19 11:17:23 crc kubenswrapper[4765]: I0319 11:17:23.985439 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wn8c4" event={"ID":"e4bcb6d5-063a-4117-a6d5-5166c5c70b50","Type":"ContainerStarted","Data":"e78dc34bf1b39fd21d09201fe6e774c42de6a9917bda07371351c87f89c0c484"} Mar 19 11:17:24 crc kubenswrapper[4765]: I0319 11:17:24.995054 4765 generic.go:334] "Generic (PLEG): container finished" podID="e4bcb6d5-063a-4117-a6d5-5166c5c70b50" containerID="e78dc34bf1b39fd21d09201fe6e774c42de6a9917bda07371351c87f89c0c484" exitCode=0 Mar 19 11:17:24 crc kubenswrapper[4765]: I0319 11:17:24.995110 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wn8c4" 
event={"ID":"e4bcb6d5-063a-4117-a6d5-5166c5c70b50","Type":"ContainerDied","Data":"e78dc34bf1b39fd21d09201fe6e774c42de6a9917bda07371351c87f89c0c484"} Mar 19 11:17:25 crc kubenswrapper[4765]: I0319 11:17:25.299225 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:25 crc kubenswrapper[4765]: I0319 11:17:25.347486 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:26 crc kubenswrapper[4765]: I0319 11:17:26.007316 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wn8c4" event={"ID":"e4bcb6d5-063a-4117-a6d5-5166c5c70b50","Type":"ContainerStarted","Data":"edb088c32c7cf50919cb0388196ddadd41b221a65bcf50a8ad299ba17b012465"} Mar 19 11:17:26 crc kubenswrapper[4765]: I0319 11:17:26.031728 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-wn8c4" podStartSLOduration=2.444818811 podStartE2EDuration="5.031711098s" podCreationTimestamp="2026-03-19 11:17:21 +0000 UTC" firstStartedPulling="2026-03-19 11:17:22.977241901 +0000 UTC m=+3341.326187443" lastFinishedPulling="2026-03-19 11:17:25.564134188 +0000 UTC m=+3343.913079730" observedRunningTime="2026-03-19 11:17:26.02364044 +0000 UTC m=+3344.372585982" watchObservedRunningTime="2026-03-19 11:17:26.031711098 +0000 UTC m=+3344.380656640" Mar 19 11:17:27 crc kubenswrapper[4765]: I0319 11:17:27.688306 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7cg5x"] Mar 19 11:17:27 crc kubenswrapper[4765]: I0319 11:17:27.688744 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-7cg5x" podUID="a09270b5-4e86-4bc1-bd38-b5d72417e555" containerName="registry-server" 
containerID="cri-o://f13db9347125e4e9de80900bef1b3cab46d335f72c5d7b5fe1ac4b0ddd93559c" gracePeriod=2 Mar 19 11:17:28 crc kubenswrapper[4765]: I0319 11:17:28.047683 4765 generic.go:334] "Generic (PLEG): container finished" podID="a09270b5-4e86-4bc1-bd38-b5d72417e555" containerID="f13db9347125e4e9de80900bef1b3cab46d335f72c5d7b5fe1ac4b0ddd93559c" exitCode=0 Mar 19 11:17:28 crc kubenswrapper[4765]: I0319 11:17:28.047806 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7cg5x" event={"ID":"a09270b5-4e86-4bc1-bd38-b5d72417e555","Type":"ContainerDied","Data":"f13db9347125e4e9de80900bef1b3cab46d335f72c5d7b5fe1ac4b0ddd93559c"} Mar 19 11:17:28 crc kubenswrapper[4765]: I0319 11:17:28.242323 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:28 crc kubenswrapper[4765]: I0319 11:17:28.286856 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09270b5-4e86-4bc1-bd38-b5d72417e555-utilities\") pod \"a09270b5-4e86-4bc1-bd38-b5d72417e555\" (UID: \"a09270b5-4e86-4bc1-bd38-b5d72417e555\") " Mar 19 11:17:28 crc kubenswrapper[4765]: I0319 11:17:28.286990 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09270b5-4e86-4bc1-bd38-b5d72417e555-catalog-content\") pod \"a09270b5-4e86-4bc1-bd38-b5d72417e555\" (UID: \"a09270b5-4e86-4bc1-bd38-b5d72417e555\") " Mar 19 11:17:28 crc kubenswrapper[4765]: I0319 11:17:28.287128 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bw5hv\" (UniqueName: \"kubernetes.io/projected/a09270b5-4e86-4bc1-bd38-b5d72417e555-kube-api-access-bw5hv\") pod \"a09270b5-4e86-4bc1-bd38-b5d72417e555\" (UID: \"a09270b5-4e86-4bc1-bd38-b5d72417e555\") " Mar 19 11:17:28 crc kubenswrapper[4765]: I0319 
11:17:28.288464 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09270b5-4e86-4bc1-bd38-b5d72417e555-utilities" (OuterVolumeSpecName: "utilities") pod "a09270b5-4e86-4bc1-bd38-b5d72417e555" (UID: "a09270b5-4e86-4bc1-bd38-b5d72417e555"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:17:28 crc kubenswrapper[4765]: I0319 11:17:28.298275 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a09270b5-4e86-4bc1-bd38-b5d72417e555-kube-api-access-bw5hv" (OuterVolumeSpecName: "kube-api-access-bw5hv") pod "a09270b5-4e86-4bc1-bd38-b5d72417e555" (UID: "a09270b5-4e86-4bc1-bd38-b5d72417e555"). InnerVolumeSpecName "kube-api-access-bw5hv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:17:28 crc kubenswrapper[4765]: I0319 11:17:28.348945 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a09270b5-4e86-4bc1-bd38-b5d72417e555-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a09270b5-4e86-4bc1-bd38-b5d72417e555" (UID: "a09270b5-4e86-4bc1-bd38-b5d72417e555"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:17:28 crc kubenswrapper[4765]: I0319 11:17:28.391273 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a09270b5-4e86-4bc1-bd38-b5d72417e555-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 11:17:28 crc kubenswrapper[4765]: I0319 11:17:28.391306 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bw5hv\" (UniqueName: \"kubernetes.io/projected/a09270b5-4e86-4bc1-bd38-b5d72417e555-kube-api-access-bw5hv\") on node \"crc\" DevicePath \"\"" Mar 19 11:17:28 crc kubenswrapper[4765]: I0319 11:17:28.391317 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a09270b5-4e86-4bc1-bd38-b5d72417e555-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 11:17:29 crc kubenswrapper[4765]: I0319 11:17:29.058541 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-7cg5x" event={"ID":"a09270b5-4e86-4bc1-bd38-b5d72417e555","Type":"ContainerDied","Data":"60e617e22426d08953a2b2d8a345a5079b8ab4c018b914f05f3ee0e22b41b8e6"} Mar 19 11:17:29 crc kubenswrapper[4765]: I0319 11:17:29.058853 4765 scope.go:117] "RemoveContainer" containerID="f13db9347125e4e9de80900bef1b3cab46d335f72c5d7b5fe1ac4b0ddd93559c" Mar 19 11:17:29 crc kubenswrapper[4765]: I0319 11:17:29.058643 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-7cg5x" Mar 19 11:17:29 crc kubenswrapper[4765]: I0319 11:17:29.082629 4765 scope.go:117] "RemoveContainer" containerID="b9074dffc3b1236de629aaf6b23cb8bac66c5a6833dba52aa4fd734b4878dba4" Mar 19 11:17:29 crc kubenswrapper[4765]: I0319 11:17:29.084878 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-7cg5x"] Mar 19 11:17:29 crc kubenswrapper[4765]: I0319 11:17:29.092651 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-7cg5x"] Mar 19 11:17:29 crc kubenswrapper[4765]: I0319 11:17:29.105990 4765 scope.go:117] "RemoveContainer" containerID="53e72870ea967452de246c8ce49d2fe8ae946e2bdce02cc41f2958cc0adc7293" Mar 19 11:17:30 crc kubenswrapper[4765]: I0319 11:17:30.365517 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a09270b5-4e86-4bc1-bd38-b5d72417e555" path="/var/lib/kubelet/pods/a09270b5-4e86-4bc1-bd38-b5d72417e555/volumes" Mar 19 11:17:31 crc kubenswrapper[4765]: I0319 11:17:31.632534 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:31 crc kubenswrapper[4765]: I0319 11:17:31.632810 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:31 crc kubenswrapper[4765]: I0319 11:17:31.690398 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:32 crc kubenswrapper[4765]: I0319 11:17:32.135680 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:32 crc kubenswrapper[4765]: I0319 11:17:32.361911 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:17:32 crc 
kubenswrapper[4765]: E0319 11:17:32.362190 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:17:32 crc kubenswrapper[4765]: I0319 11:17:32.890201 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wn8c4"] Mar 19 11:17:34 crc kubenswrapper[4765]: I0319 11:17:34.101914 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-wn8c4" podUID="e4bcb6d5-063a-4117-a6d5-5166c5c70b50" containerName="registry-server" containerID="cri-o://edb088c32c7cf50919cb0388196ddadd41b221a65bcf50a8ad299ba17b012465" gracePeriod=2 Mar 19 11:17:34 crc kubenswrapper[4765]: I0319 11:17:34.633241 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:34 crc kubenswrapper[4765]: I0319 11:17:34.706031 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qd6q7\" (UniqueName: \"kubernetes.io/projected/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-kube-api-access-qd6q7\") pod \"e4bcb6d5-063a-4117-a6d5-5166c5c70b50\" (UID: \"e4bcb6d5-063a-4117-a6d5-5166c5c70b50\") " Mar 19 11:17:34 crc kubenswrapper[4765]: I0319 11:17:34.706187 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-catalog-content\") pod \"e4bcb6d5-063a-4117-a6d5-5166c5c70b50\" (UID: \"e4bcb6d5-063a-4117-a6d5-5166c5c70b50\") " Mar 19 11:17:34 crc kubenswrapper[4765]: I0319 11:17:34.706332 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-utilities\") pod \"e4bcb6d5-063a-4117-a6d5-5166c5c70b50\" (UID: \"e4bcb6d5-063a-4117-a6d5-5166c5c70b50\") " Mar 19 11:17:34 crc kubenswrapper[4765]: I0319 11:17:34.707702 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-utilities" (OuterVolumeSpecName: "utilities") pod "e4bcb6d5-063a-4117-a6d5-5166c5c70b50" (UID: "e4bcb6d5-063a-4117-a6d5-5166c5c70b50"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:17:34 crc kubenswrapper[4765]: I0319 11:17:34.724401 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-kube-api-access-qd6q7" (OuterVolumeSpecName: "kube-api-access-qd6q7") pod "e4bcb6d5-063a-4117-a6d5-5166c5c70b50" (UID: "e4bcb6d5-063a-4117-a6d5-5166c5c70b50"). InnerVolumeSpecName "kube-api-access-qd6q7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:17:34 crc kubenswrapper[4765]: I0319 11:17:34.773982 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4bcb6d5-063a-4117-a6d5-5166c5c70b50" (UID: "e4bcb6d5-063a-4117-a6d5-5166c5c70b50"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:17:34 crc kubenswrapper[4765]: I0319 11:17:34.808646 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 11:17:34 crc kubenswrapper[4765]: I0319 11:17:34.808678 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 11:17:34 crc kubenswrapper[4765]: I0319 11:17:34.808724 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qd6q7\" (UniqueName: \"kubernetes.io/projected/e4bcb6d5-063a-4117-a6d5-5166c5c70b50-kube-api-access-qd6q7\") on node \"crc\" DevicePath \"\"" Mar 19 11:17:35 crc kubenswrapper[4765]: I0319 11:17:35.111313 4765 generic.go:334] "Generic (PLEG): container finished" podID="e4bcb6d5-063a-4117-a6d5-5166c5c70b50" containerID="edb088c32c7cf50919cb0388196ddadd41b221a65bcf50a8ad299ba17b012465" exitCode=0 Mar 19 11:17:35 crc kubenswrapper[4765]: I0319 11:17:35.111364 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-wn8c4" Mar 19 11:17:35 crc kubenswrapper[4765]: I0319 11:17:35.111377 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wn8c4" event={"ID":"e4bcb6d5-063a-4117-a6d5-5166c5c70b50","Type":"ContainerDied","Data":"edb088c32c7cf50919cb0388196ddadd41b221a65bcf50a8ad299ba17b012465"} Mar 19 11:17:35 crc kubenswrapper[4765]: I0319 11:17:35.111585 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-wn8c4" event={"ID":"e4bcb6d5-063a-4117-a6d5-5166c5c70b50","Type":"ContainerDied","Data":"cad07ab4f3d6f7a1a5149781a3eadf447cc98e1d0894d49580ffee92d94cb823"} Mar 19 11:17:35 crc kubenswrapper[4765]: I0319 11:17:35.111611 4765 scope.go:117] "RemoveContainer" containerID="edb088c32c7cf50919cb0388196ddadd41b221a65bcf50a8ad299ba17b012465" Mar 19 11:17:35 crc kubenswrapper[4765]: I0319 11:17:35.137674 4765 scope.go:117] "RemoveContainer" containerID="e78dc34bf1b39fd21d09201fe6e774c42de6a9917bda07371351c87f89c0c484" Mar 19 11:17:35 crc kubenswrapper[4765]: I0319 11:17:35.150247 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-wn8c4"] Mar 19 11:17:35 crc kubenswrapper[4765]: I0319 11:17:35.159332 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-wn8c4"] Mar 19 11:17:35 crc kubenswrapper[4765]: I0319 11:17:35.178099 4765 scope.go:117] "RemoveContainer" containerID="5514eda27ff576a7528f831734f7d142390e544b0a952424d7f8c154f9ccd572" Mar 19 11:17:35 crc kubenswrapper[4765]: I0319 11:17:35.212444 4765 scope.go:117] "RemoveContainer" containerID="edb088c32c7cf50919cb0388196ddadd41b221a65bcf50a8ad299ba17b012465" Mar 19 11:17:35 crc kubenswrapper[4765]: E0319 11:17:35.213076 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"edb088c32c7cf50919cb0388196ddadd41b221a65bcf50a8ad299ba17b012465\": container with ID starting with edb088c32c7cf50919cb0388196ddadd41b221a65bcf50a8ad299ba17b012465 not found: ID does not exist" containerID="edb088c32c7cf50919cb0388196ddadd41b221a65bcf50a8ad299ba17b012465" Mar 19 11:17:35 crc kubenswrapper[4765]: I0319 11:17:35.213130 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edb088c32c7cf50919cb0388196ddadd41b221a65bcf50a8ad299ba17b012465"} err="failed to get container status \"edb088c32c7cf50919cb0388196ddadd41b221a65bcf50a8ad299ba17b012465\": rpc error: code = NotFound desc = could not find container \"edb088c32c7cf50919cb0388196ddadd41b221a65bcf50a8ad299ba17b012465\": container with ID starting with edb088c32c7cf50919cb0388196ddadd41b221a65bcf50a8ad299ba17b012465 not found: ID does not exist" Mar 19 11:17:35 crc kubenswrapper[4765]: I0319 11:17:35.213154 4765 scope.go:117] "RemoveContainer" containerID="e78dc34bf1b39fd21d09201fe6e774c42de6a9917bda07371351c87f89c0c484" Mar 19 11:17:35 crc kubenswrapper[4765]: E0319 11:17:35.213534 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e78dc34bf1b39fd21d09201fe6e774c42de6a9917bda07371351c87f89c0c484\": container with ID starting with e78dc34bf1b39fd21d09201fe6e774c42de6a9917bda07371351c87f89c0c484 not found: ID does not exist" containerID="e78dc34bf1b39fd21d09201fe6e774c42de6a9917bda07371351c87f89c0c484" Mar 19 11:17:35 crc kubenswrapper[4765]: I0319 11:17:35.213561 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e78dc34bf1b39fd21d09201fe6e774c42de6a9917bda07371351c87f89c0c484"} err="failed to get container status \"e78dc34bf1b39fd21d09201fe6e774c42de6a9917bda07371351c87f89c0c484\": rpc error: code = NotFound desc = could not find container \"e78dc34bf1b39fd21d09201fe6e774c42de6a9917bda07371351c87f89c0c484\": container with ID 
starting with e78dc34bf1b39fd21d09201fe6e774c42de6a9917bda07371351c87f89c0c484 not found: ID does not exist" Mar 19 11:17:35 crc kubenswrapper[4765]: I0319 11:17:35.213576 4765 scope.go:117] "RemoveContainer" containerID="5514eda27ff576a7528f831734f7d142390e544b0a952424d7f8c154f9ccd572" Mar 19 11:17:35 crc kubenswrapper[4765]: E0319 11:17:35.213914 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5514eda27ff576a7528f831734f7d142390e544b0a952424d7f8c154f9ccd572\": container with ID starting with 5514eda27ff576a7528f831734f7d142390e544b0a952424d7f8c154f9ccd572 not found: ID does not exist" containerID="5514eda27ff576a7528f831734f7d142390e544b0a952424d7f8c154f9ccd572" Mar 19 11:17:35 crc kubenswrapper[4765]: I0319 11:17:35.213936 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5514eda27ff576a7528f831734f7d142390e544b0a952424d7f8c154f9ccd572"} err="failed to get container status \"5514eda27ff576a7528f831734f7d142390e544b0a952424d7f8c154f9ccd572\": rpc error: code = NotFound desc = could not find container \"5514eda27ff576a7528f831734f7d142390e544b0a952424d7f8c154f9ccd572\": container with ID starting with 5514eda27ff576a7528f831734f7d142390e544b0a952424d7f8c154f9ccd572 not found: ID does not exist" Mar 19 11:17:36 crc kubenswrapper[4765]: I0319 11:17:36.374985 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4bcb6d5-063a-4117-a6d5-5166c5c70b50" path="/var/lib/kubelet/pods/e4bcb6d5-063a-4117-a6d5-5166c5c70b50/volumes" Mar 19 11:17:47 crc kubenswrapper[4765]: I0319 11:17:47.356340 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:17:47 crc kubenswrapper[4765]: E0319 11:17:47.357048 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.179889 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565318-68f87"] Mar 19 11:18:00 crc kubenswrapper[4765]: E0319 11:18:00.180785 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09270b5-4e86-4bc1-bd38-b5d72417e555" containerName="registry-server" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.180810 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09270b5-4e86-4bc1-bd38-b5d72417e555" containerName="registry-server" Mar 19 11:18:00 crc kubenswrapper[4765]: E0319 11:18:00.180829 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4bcb6d5-063a-4117-a6d5-5166c5c70b50" containerName="registry-server" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.180835 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4bcb6d5-063a-4117-a6d5-5166c5c70b50" containerName="registry-server" Mar 19 11:18:00 crc kubenswrapper[4765]: E0319 11:18:00.180861 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4bcb6d5-063a-4117-a6d5-5166c5c70b50" containerName="extract-content" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.180868 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4bcb6d5-063a-4117-a6d5-5166c5c70b50" containerName="extract-content" Mar 19 11:18:00 crc kubenswrapper[4765]: E0319 11:18:00.180876 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09270b5-4e86-4bc1-bd38-b5d72417e555" containerName="extract-content" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.180883 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09270b5-4e86-4bc1-bd38-b5d72417e555" 
containerName="extract-content" Mar 19 11:18:00 crc kubenswrapper[4765]: E0319 11:18:00.180900 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4bcb6d5-063a-4117-a6d5-5166c5c70b50" containerName="extract-utilities" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.180906 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4bcb6d5-063a-4117-a6d5-5166c5c70b50" containerName="extract-utilities" Mar 19 11:18:00 crc kubenswrapper[4765]: E0319 11:18:00.180921 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a09270b5-4e86-4bc1-bd38-b5d72417e555" containerName="extract-utilities" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.180926 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a09270b5-4e86-4bc1-bd38-b5d72417e555" containerName="extract-utilities" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.181169 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="a09270b5-4e86-4bc1-bd38-b5d72417e555" containerName="registry-server" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.181187 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4bcb6d5-063a-4117-a6d5-5166c5c70b50" containerName="registry-server" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.181888 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565318-68f87" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.184467 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.185250 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.185792 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.189228 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565318-68f87"] Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.291144 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g44pt\" (UniqueName: \"kubernetes.io/projected/5f2a323c-cc66-49f2-b43d-6fc8bb970dd3-kube-api-access-g44pt\") pod \"auto-csr-approver-29565318-68f87\" (UID: \"5f2a323c-cc66-49f2-b43d-6fc8bb970dd3\") " pod="openshift-infra/auto-csr-approver-29565318-68f87" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.393380 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g44pt\" (UniqueName: \"kubernetes.io/projected/5f2a323c-cc66-49f2-b43d-6fc8bb970dd3-kube-api-access-g44pt\") pod \"auto-csr-approver-29565318-68f87\" (UID: \"5f2a323c-cc66-49f2-b43d-6fc8bb970dd3\") " pod="openshift-infra/auto-csr-approver-29565318-68f87" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.413385 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g44pt\" (UniqueName: \"kubernetes.io/projected/5f2a323c-cc66-49f2-b43d-6fc8bb970dd3-kube-api-access-g44pt\") pod \"auto-csr-approver-29565318-68f87\" (UID: \"5f2a323c-cc66-49f2-b43d-6fc8bb970dd3\") " 
pod="openshift-infra/auto-csr-approver-29565318-68f87" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.513026 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565318-68f87" Mar 19 11:18:00 crc kubenswrapper[4765]: I0319 11:18:00.947999 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565318-68f87"] Mar 19 11:18:01 crc kubenswrapper[4765]: I0319 11:18:01.359087 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565318-68f87" event={"ID":"5f2a323c-cc66-49f2-b43d-6fc8bb970dd3","Type":"ContainerStarted","Data":"d8ecfc270f83eff4ace82da39489fad0c5af3f576e2765a0b4d2a0bf5801e347"} Mar 19 11:18:02 crc kubenswrapper[4765]: I0319 11:18:02.363172 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:18:02 crc kubenswrapper[4765]: E0319 11:18:02.363883 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:18:02 crc kubenswrapper[4765]: I0319 11:18:02.368582 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565318-68f87" event={"ID":"5f2a323c-cc66-49f2-b43d-6fc8bb970dd3","Type":"ContainerStarted","Data":"6fc5a138083539101dd6ef835737b7db82f036582c816fe1daae206a9f55f116"} Mar 19 11:18:02 crc kubenswrapper[4765]: I0319 11:18:02.403659 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565318-68f87" podStartSLOduration=1.284711148 
podStartE2EDuration="2.403634046s" podCreationTimestamp="2026-03-19 11:18:00 +0000 UTC" firstStartedPulling="2026-03-19 11:18:00.952213444 +0000 UTC m=+3379.301159016" lastFinishedPulling="2026-03-19 11:18:02.071136372 +0000 UTC m=+3380.420081914" observedRunningTime="2026-03-19 11:18:02.39710831 +0000 UTC m=+3380.746053852" watchObservedRunningTime="2026-03-19 11:18:02.403634046 +0000 UTC m=+3380.752579588" Mar 19 11:18:03 crc kubenswrapper[4765]: I0319 11:18:03.380862 4765 generic.go:334] "Generic (PLEG): container finished" podID="5f2a323c-cc66-49f2-b43d-6fc8bb970dd3" containerID="6fc5a138083539101dd6ef835737b7db82f036582c816fe1daae206a9f55f116" exitCode=0 Mar 19 11:18:03 crc kubenswrapper[4765]: I0319 11:18:03.380928 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565318-68f87" event={"ID":"5f2a323c-cc66-49f2-b43d-6fc8bb970dd3","Type":"ContainerDied","Data":"6fc5a138083539101dd6ef835737b7db82f036582c816fe1daae206a9f55f116"} Mar 19 11:18:04 crc kubenswrapper[4765]: I0319 11:18:04.769585 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565318-68f87" Mar 19 11:18:04 crc kubenswrapper[4765]: I0319 11:18:04.891346 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g44pt\" (UniqueName: \"kubernetes.io/projected/5f2a323c-cc66-49f2-b43d-6fc8bb970dd3-kube-api-access-g44pt\") pod \"5f2a323c-cc66-49f2-b43d-6fc8bb970dd3\" (UID: \"5f2a323c-cc66-49f2-b43d-6fc8bb970dd3\") " Mar 19 11:18:04 crc kubenswrapper[4765]: I0319 11:18:04.898702 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f2a323c-cc66-49f2-b43d-6fc8bb970dd3-kube-api-access-g44pt" (OuterVolumeSpecName: "kube-api-access-g44pt") pod "5f2a323c-cc66-49f2-b43d-6fc8bb970dd3" (UID: "5f2a323c-cc66-49f2-b43d-6fc8bb970dd3"). InnerVolumeSpecName "kube-api-access-g44pt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:18:04 crc kubenswrapper[4765]: I0319 11:18:04.994396 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g44pt\" (UniqueName: \"kubernetes.io/projected/5f2a323c-cc66-49f2-b43d-6fc8bb970dd3-kube-api-access-g44pt\") on node \"crc\" DevicePath \"\"" Mar 19 11:18:05 crc kubenswrapper[4765]: I0319 11:18:05.400917 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565318-68f87" event={"ID":"5f2a323c-cc66-49f2-b43d-6fc8bb970dd3","Type":"ContainerDied","Data":"d8ecfc270f83eff4ace82da39489fad0c5af3f576e2765a0b4d2a0bf5801e347"} Mar 19 11:18:05 crc kubenswrapper[4765]: I0319 11:18:05.400997 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d8ecfc270f83eff4ace82da39489fad0c5af3f576e2765a0b4d2a0bf5801e347" Mar 19 11:18:05 crc kubenswrapper[4765]: I0319 11:18:05.401281 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565318-68f87" Mar 19 11:18:05 crc kubenswrapper[4765]: I0319 11:18:05.457244 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565312-zb8d9"] Mar 19 11:18:05 crc kubenswrapper[4765]: I0319 11:18:05.465146 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565312-zb8d9"] Mar 19 11:18:06 crc kubenswrapper[4765]: I0319 11:18:06.366476 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c751965-eb03-4655-9abc-917dc9b5aeb1" path="/var/lib/kubelet/pods/2c751965-eb03-4655-9abc-917dc9b5aeb1/volumes" Mar 19 11:18:13 crc kubenswrapper[4765]: I0319 11:18:13.357120 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:18:13 crc kubenswrapper[4765]: E0319 11:18:13.357903 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:18:15 crc kubenswrapper[4765]: I0319 11:18:15.753288 4765 scope.go:117] "RemoveContainer" containerID="b1dab3a48fe59b251ff95c56340004eeffdec8bd31fe69fee86cdecc7c716676" Mar 19 11:18:28 crc kubenswrapper[4765]: I0319 11:18:28.356114 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:18:28 crc kubenswrapper[4765]: E0319 11:18:28.358037 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:18:41 crc kubenswrapper[4765]: I0319 11:18:41.356764 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:18:41 crc kubenswrapper[4765]: E0319 11:18:41.357527 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:18:45 crc kubenswrapper[4765]: I0319 11:18:45.672882 4765 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-marketplace/redhat-operators-p6vn5"] Mar 19 11:18:45 crc kubenswrapper[4765]: E0319 11:18:45.677293 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f2a323c-cc66-49f2-b43d-6fc8bb970dd3" containerName="oc" Mar 19 11:18:45 crc kubenswrapper[4765]: I0319 11:18:45.677386 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f2a323c-cc66-49f2-b43d-6fc8bb970dd3" containerName="oc" Mar 19 11:18:45 crc kubenswrapper[4765]: I0319 11:18:45.680217 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f2a323c-cc66-49f2-b43d-6fc8bb970dd3" containerName="oc" Mar 19 11:18:45 crc kubenswrapper[4765]: I0319 11:18:45.683353 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:18:45 crc kubenswrapper[4765]: I0319 11:18:45.691400 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6vn5"] Mar 19 11:18:45 crc kubenswrapper[4765]: I0319 11:18:45.740260 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783d095e-ef47-4b48-b93b-262f86ca1d43-utilities\") pod \"redhat-operators-p6vn5\" (UID: \"783d095e-ef47-4b48-b93b-262f86ca1d43\") " pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:18:45 crc kubenswrapper[4765]: I0319 11:18:45.740352 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783d095e-ef47-4b48-b93b-262f86ca1d43-catalog-content\") pod \"redhat-operators-p6vn5\" (UID: \"783d095e-ef47-4b48-b93b-262f86ca1d43\") " pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:18:45 crc kubenswrapper[4765]: I0319 11:18:45.740392 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wpcfx\" (UniqueName: 
\"kubernetes.io/projected/783d095e-ef47-4b48-b93b-262f86ca1d43-kube-api-access-wpcfx\") pod \"redhat-operators-p6vn5\" (UID: \"783d095e-ef47-4b48-b93b-262f86ca1d43\") " pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:18:45 crc kubenswrapper[4765]: I0319 11:18:45.842688 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783d095e-ef47-4b48-b93b-262f86ca1d43-utilities\") pod \"redhat-operators-p6vn5\" (UID: \"783d095e-ef47-4b48-b93b-262f86ca1d43\") " pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:18:45 crc kubenswrapper[4765]: I0319 11:18:45.842770 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783d095e-ef47-4b48-b93b-262f86ca1d43-catalog-content\") pod \"redhat-operators-p6vn5\" (UID: \"783d095e-ef47-4b48-b93b-262f86ca1d43\") " pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:18:45 crc kubenswrapper[4765]: I0319 11:18:45.842798 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wpcfx\" (UniqueName: \"kubernetes.io/projected/783d095e-ef47-4b48-b93b-262f86ca1d43-kube-api-access-wpcfx\") pod \"redhat-operators-p6vn5\" (UID: \"783d095e-ef47-4b48-b93b-262f86ca1d43\") " pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:18:45 crc kubenswrapper[4765]: I0319 11:18:45.843346 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783d095e-ef47-4b48-b93b-262f86ca1d43-utilities\") pod \"redhat-operators-p6vn5\" (UID: \"783d095e-ef47-4b48-b93b-262f86ca1d43\") " pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:18:45 crc kubenswrapper[4765]: I0319 11:18:45.843604 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/783d095e-ef47-4b48-b93b-262f86ca1d43-catalog-content\") pod \"redhat-operators-p6vn5\" (UID: \"783d095e-ef47-4b48-b93b-262f86ca1d43\") " pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:18:45 crc kubenswrapper[4765]: I0319 11:18:45.872885 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wpcfx\" (UniqueName: \"kubernetes.io/projected/783d095e-ef47-4b48-b93b-262f86ca1d43-kube-api-access-wpcfx\") pod \"redhat-operators-p6vn5\" (UID: \"783d095e-ef47-4b48-b93b-262f86ca1d43\") " pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:18:46 crc kubenswrapper[4765]: I0319 11:18:46.007315 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:18:46 crc kubenswrapper[4765]: I0319 11:18:46.459863 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-p6vn5"] Mar 19 11:18:46 crc kubenswrapper[4765]: I0319 11:18:46.776878 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6vn5" event={"ID":"783d095e-ef47-4b48-b93b-262f86ca1d43","Type":"ContainerStarted","Data":"4810af6b6005dcd4de91ddb389be0a85a4df35feb326c3f0ea7fa7eda4a296cf"} Mar 19 11:18:47 crc kubenswrapper[4765]: I0319 11:18:47.787507 4765 generic.go:334] "Generic (PLEG): container finished" podID="783d095e-ef47-4b48-b93b-262f86ca1d43" containerID="eae844cf98505f2aacca60f7675cb4c625031fa210cef54f8824cd9896d71a4e" exitCode=0 Mar 19 11:18:47 crc kubenswrapper[4765]: I0319 11:18:47.787557 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6vn5" event={"ID":"783d095e-ef47-4b48-b93b-262f86ca1d43","Type":"ContainerDied","Data":"eae844cf98505f2aacca60f7675cb4c625031fa210cef54f8824cd9896d71a4e"} Mar 19 11:18:49 crc kubenswrapper[4765]: I0319 11:18:49.806513 4765 generic.go:334] "Generic (PLEG): container finished" 
podID="783d095e-ef47-4b48-b93b-262f86ca1d43" containerID="03d08674a2b02253cde8a41cdf09b6228f92ab1a20d00fc165f620f8b94461b5" exitCode=0 Mar 19 11:18:49 crc kubenswrapper[4765]: I0319 11:18:49.806582 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6vn5" event={"ID":"783d095e-ef47-4b48-b93b-262f86ca1d43","Type":"ContainerDied","Data":"03d08674a2b02253cde8a41cdf09b6228f92ab1a20d00fc165f620f8b94461b5"} Mar 19 11:18:50 crc kubenswrapper[4765]: I0319 11:18:50.832705 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6vn5" event={"ID":"783d095e-ef47-4b48-b93b-262f86ca1d43","Type":"ContainerStarted","Data":"e17997ae28293b9e36e2604e3a4dc239ea50a2a241326a2f4ea01270ca86bf01"} Mar 19 11:18:50 crc kubenswrapper[4765]: I0319 11:18:50.859207 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-p6vn5" podStartSLOduration=3.358312529 podStartE2EDuration="5.859182942s" podCreationTimestamp="2026-03-19 11:18:45 +0000 UTC" firstStartedPulling="2026-03-19 11:18:47.789220803 +0000 UTC m=+3426.138166345" lastFinishedPulling="2026-03-19 11:18:50.290091216 +0000 UTC m=+3428.639036758" observedRunningTime="2026-03-19 11:18:50.853633232 +0000 UTC m=+3429.202578774" watchObservedRunningTime="2026-03-19 11:18:50.859182942 +0000 UTC m=+3429.208128484" Mar 19 11:18:52 crc kubenswrapper[4765]: I0319 11:18:52.363484 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:18:52 crc kubenswrapper[4765]: E0319 11:18:52.364018 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" 
pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:18:56 crc kubenswrapper[4765]: I0319 11:18:56.008059 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:18:56 crc kubenswrapper[4765]: I0319 11:18:56.008580 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:18:57 crc kubenswrapper[4765]: I0319 11:18:57.059068 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-p6vn5" podUID="783d095e-ef47-4b48-b93b-262f86ca1d43" containerName="registry-server" probeResult="failure" output=< Mar 19 11:18:57 crc kubenswrapper[4765]: timeout: failed to connect service ":50051" within 1s Mar 19 11:18:57 crc kubenswrapper[4765]: > Mar 19 11:19:06 crc kubenswrapper[4765]: I0319 11:19:06.059648 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:19:06 crc kubenswrapper[4765]: I0319 11:19:06.108677 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:19:06 crc kubenswrapper[4765]: I0319 11:19:06.300911 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6vn5"] Mar 19 11:19:07 crc kubenswrapper[4765]: I0319 11:19:07.356763 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:19:07 crc kubenswrapper[4765]: I0319 11:19:07.981486 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"9f428a0136b444a0c21f9c6c085da235b98f398bb53eda25b0fa7e3ce28d5318"} Mar 19 11:19:07 crc 
kubenswrapper[4765]: I0319 11:19:07.981583 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-p6vn5" podUID="783d095e-ef47-4b48-b93b-262f86ca1d43" containerName="registry-server" containerID="cri-o://e17997ae28293b9e36e2604e3a4dc239ea50a2a241326a2f4ea01270ca86bf01" gracePeriod=2 Mar 19 11:19:08 crc kubenswrapper[4765]: I0319 11:19:08.561365 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:19:08 crc kubenswrapper[4765]: I0319 11:19:08.691418 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783d095e-ef47-4b48-b93b-262f86ca1d43-utilities\") pod \"783d095e-ef47-4b48-b93b-262f86ca1d43\" (UID: \"783d095e-ef47-4b48-b93b-262f86ca1d43\") " Mar 19 11:19:08 crc kubenswrapper[4765]: I0319 11:19:08.691873 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wpcfx\" (UniqueName: \"kubernetes.io/projected/783d095e-ef47-4b48-b93b-262f86ca1d43-kube-api-access-wpcfx\") pod \"783d095e-ef47-4b48-b93b-262f86ca1d43\" (UID: \"783d095e-ef47-4b48-b93b-262f86ca1d43\") " Mar 19 11:19:08 crc kubenswrapper[4765]: I0319 11:19:08.691933 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783d095e-ef47-4b48-b93b-262f86ca1d43-catalog-content\") pod \"783d095e-ef47-4b48-b93b-262f86ca1d43\" (UID: \"783d095e-ef47-4b48-b93b-262f86ca1d43\") " Mar 19 11:19:08 crc kubenswrapper[4765]: I0319 11:19:08.692228 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/783d095e-ef47-4b48-b93b-262f86ca1d43-utilities" (OuterVolumeSpecName: "utilities") pod "783d095e-ef47-4b48-b93b-262f86ca1d43" (UID: "783d095e-ef47-4b48-b93b-262f86ca1d43"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:19:08 crc kubenswrapper[4765]: I0319 11:19:08.692571 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/783d095e-ef47-4b48-b93b-262f86ca1d43-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 11:19:08 crc kubenswrapper[4765]: I0319 11:19:08.697619 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/783d095e-ef47-4b48-b93b-262f86ca1d43-kube-api-access-wpcfx" (OuterVolumeSpecName: "kube-api-access-wpcfx") pod "783d095e-ef47-4b48-b93b-262f86ca1d43" (UID: "783d095e-ef47-4b48-b93b-262f86ca1d43"). InnerVolumeSpecName "kube-api-access-wpcfx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:19:08 crc kubenswrapper[4765]: I0319 11:19:08.795063 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wpcfx\" (UniqueName: \"kubernetes.io/projected/783d095e-ef47-4b48-b93b-262f86ca1d43-kube-api-access-wpcfx\") on node \"crc\" DevicePath \"\"" Mar 19 11:19:08 crc kubenswrapper[4765]: I0319 11:19:08.843998 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/783d095e-ef47-4b48-b93b-262f86ca1d43-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "783d095e-ef47-4b48-b93b-262f86ca1d43" (UID: "783d095e-ef47-4b48-b93b-262f86ca1d43"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:19:08 crc kubenswrapper[4765]: I0319 11:19:08.896791 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/783d095e-ef47-4b48-b93b-262f86ca1d43-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 11:19:08 crc kubenswrapper[4765]: I0319 11:19:08.992651 4765 generic.go:334] "Generic (PLEG): container finished" podID="783d095e-ef47-4b48-b93b-262f86ca1d43" containerID="e17997ae28293b9e36e2604e3a4dc239ea50a2a241326a2f4ea01270ca86bf01" exitCode=0 Mar 19 11:19:08 crc kubenswrapper[4765]: I0319 11:19:08.992696 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6vn5" event={"ID":"783d095e-ef47-4b48-b93b-262f86ca1d43","Type":"ContainerDied","Data":"e17997ae28293b9e36e2604e3a4dc239ea50a2a241326a2f4ea01270ca86bf01"} Mar 19 11:19:08 crc kubenswrapper[4765]: I0319 11:19:08.992725 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-p6vn5" event={"ID":"783d095e-ef47-4b48-b93b-262f86ca1d43","Type":"ContainerDied","Data":"4810af6b6005dcd4de91ddb389be0a85a4df35feb326c3f0ea7fa7eda4a296cf"} Mar 19 11:19:08 crc kubenswrapper[4765]: I0319 11:19:08.992743 4765 scope.go:117] "RemoveContainer" containerID="e17997ae28293b9e36e2604e3a4dc239ea50a2a241326a2f4ea01270ca86bf01" Mar 19 11:19:08 crc kubenswrapper[4765]: I0319 11:19:08.992865 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-p6vn5" Mar 19 11:19:09 crc kubenswrapper[4765]: I0319 11:19:09.037779 4765 scope.go:117] "RemoveContainer" containerID="03d08674a2b02253cde8a41cdf09b6228f92ab1a20d00fc165f620f8b94461b5" Mar 19 11:19:09 crc kubenswrapper[4765]: I0319 11:19:09.041884 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-p6vn5"] Mar 19 11:19:09 crc kubenswrapper[4765]: I0319 11:19:09.051556 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-p6vn5"] Mar 19 11:19:09 crc kubenswrapper[4765]: I0319 11:19:09.059307 4765 scope.go:117] "RemoveContainer" containerID="eae844cf98505f2aacca60f7675cb4c625031fa210cef54f8824cd9896d71a4e" Mar 19 11:19:09 crc kubenswrapper[4765]: I0319 11:19:09.104560 4765 scope.go:117] "RemoveContainer" containerID="e17997ae28293b9e36e2604e3a4dc239ea50a2a241326a2f4ea01270ca86bf01" Mar 19 11:19:09 crc kubenswrapper[4765]: E0319 11:19:09.105385 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e17997ae28293b9e36e2604e3a4dc239ea50a2a241326a2f4ea01270ca86bf01\": container with ID starting with e17997ae28293b9e36e2604e3a4dc239ea50a2a241326a2f4ea01270ca86bf01 not found: ID does not exist" containerID="e17997ae28293b9e36e2604e3a4dc239ea50a2a241326a2f4ea01270ca86bf01" Mar 19 11:19:09 crc kubenswrapper[4765]: I0319 11:19:09.105428 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e17997ae28293b9e36e2604e3a4dc239ea50a2a241326a2f4ea01270ca86bf01"} err="failed to get container status \"e17997ae28293b9e36e2604e3a4dc239ea50a2a241326a2f4ea01270ca86bf01\": rpc error: code = NotFound desc = could not find container \"e17997ae28293b9e36e2604e3a4dc239ea50a2a241326a2f4ea01270ca86bf01\": container with ID starting with e17997ae28293b9e36e2604e3a4dc239ea50a2a241326a2f4ea01270ca86bf01 not found: ID does 
not exist" Mar 19 11:19:09 crc kubenswrapper[4765]: I0319 11:19:09.105484 4765 scope.go:117] "RemoveContainer" containerID="03d08674a2b02253cde8a41cdf09b6228f92ab1a20d00fc165f620f8b94461b5" Mar 19 11:19:09 crc kubenswrapper[4765]: E0319 11:19:09.105813 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"03d08674a2b02253cde8a41cdf09b6228f92ab1a20d00fc165f620f8b94461b5\": container with ID starting with 03d08674a2b02253cde8a41cdf09b6228f92ab1a20d00fc165f620f8b94461b5 not found: ID does not exist" containerID="03d08674a2b02253cde8a41cdf09b6228f92ab1a20d00fc165f620f8b94461b5" Mar 19 11:19:09 crc kubenswrapper[4765]: I0319 11:19:09.105849 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"03d08674a2b02253cde8a41cdf09b6228f92ab1a20d00fc165f620f8b94461b5"} err="failed to get container status \"03d08674a2b02253cde8a41cdf09b6228f92ab1a20d00fc165f620f8b94461b5\": rpc error: code = NotFound desc = could not find container \"03d08674a2b02253cde8a41cdf09b6228f92ab1a20d00fc165f620f8b94461b5\": container with ID starting with 03d08674a2b02253cde8a41cdf09b6228f92ab1a20d00fc165f620f8b94461b5 not found: ID does not exist" Mar 19 11:19:09 crc kubenswrapper[4765]: I0319 11:19:09.105892 4765 scope.go:117] "RemoveContainer" containerID="eae844cf98505f2aacca60f7675cb4c625031fa210cef54f8824cd9896d71a4e" Mar 19 11:19:09 crc kubenswrapper[4765]: E0319 11:19:09.106147 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eae844cf98505f2aacca60f7675cb4c625031fa210cef54f8824cd9896d71a4e\": container with ID starting with eae844cf98505f2aacca60f7675cb4c625031fa210cef54f8824cd9896d71a4e not found: ID does not exist" containerID="eae844cf98505f2aacca60f7675cb4c625031fa210cef54f8824cd9896d71a4e" Mar 19 11:19:09 crc kubenswrapper[4765]: I0319 11:19:09.106177 4765 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eae844cf98505f2aacca60f7675cb4c625031fa210cef54f8824cd9896d71a4e"} err="failed to get container status \"eae844cf98505f2aacca60f7675cb4c625031fa210cef54f8824cd9896d71a4e\": rpc error: code = NotFound desc = could not find container \"eae844cf98505f2aacca60f7675cb4c625031fa210cef54f8824cd9896d71a4e\": container with ID starting with eae844cf98505f2aacca60f7675cb4c625031fa210cef54f8824cd9896d71a4e not found: ID does not exist" Mar 19 11:19:10 crc kubenswrapper[4765]: I0319 11:19:10.004292 4765 generic.go:334] "Generic (PLEG): container finished" podID="65eabf0c-0a01-4d5b-aefd-d9ce064e1d66" containerID="1333137a4a2ef8a5895accf9377cfee0342716cde7a94be61c92b5f6d8908736" exitCode=0 Mar 19 11:19:10 crc kubenswrapper[4765]: I0319 11:19:10.004397 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66","Type":"ContainerDied","Data":"1333137a4a2ef8a5895accf9377cfee0342716cde7a94be61c92b5f6d8908736"} Mar 19 11:19:10 crc kubenswrapper[4765]: I0319 11:19:10.367604 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="783d095e-ef47-4b48-b93b-262f86ca1d43" path="/var/lib/kubelet/pods/783d095e-ef47-4b48-b93b-262f86ca1d43/volumes" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.399471 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.461563 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-ca-certs\") pod \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.461650 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-openstack-config-secret\") pod \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.461681 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m2wf\" (UniqueName: \"kubernetes.io/projected/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-kube-api-access-4m2wf\") pod \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.461782 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-test-operator-ephemeral-temporary\") pod \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.461817 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-test-operator-ephemeral-workdir\") pod \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.461842 4765 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-config-data\") pod \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.461870 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-openstack-config\") pod \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.461918 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-ssh-key\") pod \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.461941 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"test-operator-logs\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\" (UID: \"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66\") " Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.463535 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-config-data" (OuterVolumeSpecName: "config-data") pod "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66" (UID: "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.464181 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-test-operator-ephemeral-temporary" (OuterVolumeSpecName: "test-operator-ephemeral-temporary") pod "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66" (UID: "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66"). InnerVolumeSpecName "test-operator-ephemeral-temporary". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.466763 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-test-operator-ephemeral-workdir" (OuterVolumeSpecName: "test-operator-ephemeral-workdir") pod "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66" (UID: "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66"). InnerVolumeSpecName "test-operator-ephemeral-workdir". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.468745 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "test-operator-logs") pod "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66" (UID: "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66"). InnerVolumeSpecName "local-storage10-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.469068 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-kube-api-access-4m2wf" (OuterVolumeSpecName: "kube-api-access-4m2wf") pod "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66" (UID: "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66"). InnerVolumeSpecName "kube-api-access-4m2wf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.496261 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66" (UID: "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66"). InnerVolumeSpecName "openstack-config-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.496798 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-ssh-key" (OuterVolumeSpecName: "ssh-key") pod "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66" (UID: "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66"). InnerVolumeSpecName "ssh-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.509361 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-ca-certs" (OuterVolumeSpecName: "ca-certs") pod "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66" (UID: "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66"). InnerVolumeSpecName "ca-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.518838 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66" (UID: "65eabf0c-0a01-4d5b-aefd-d9ce064e1d66"). InnerVolumeSpecName "openstack-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.570730 4765 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.570768 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4m2wf\" (UniqueName: \"kubernetes.io/projected/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-kube-api-access-4m2wf\") on node \"crc\" DevicePath \"\"" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.570783 4765 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-temporary\" (UniqueName: \"kubernetes.io/empty-dir/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-test-operator-ephemeral-temporary\") on node \"crc\" DevicePath \"\"" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.570799 4765 reconciler_common.go:293] "Volume detached for volume \"test-operator-ephemeral-workdir\" (UniqueName: \"kubernetes.io/empty-dir/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-test-operator-ephemeral-workdir\") on node \"crc\" DevicePath \"\"" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.570814 4765 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-config-data\") on node \"crc\" DevicePath \"\"" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.570826 4765 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-openstack-config\") on node \"crc\" DevicePath \"\"" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.570838 4765 reconciler_common.go:293] "Volume detached for volume \"ssh-key\" (UniqueName: \"kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-ssh-key\") on node \"crc\" 
DevicePath \"\"" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.570877 4765 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.570891 4765 reconciler_common.go:293] "Volume detached for volume \"ca-certs\" (UniqueName: \"kubernetes.io/secret/65eabf0c-0a01-4d5b-aefd-d9ce064e1d66-ca-certs\") on node \"crc\" DevicePath \"\"" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.597381 4765 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Mar 19 11:19:11 crc kubenswrapper[4765]: I0319 11:19:11.674671 4765 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Mar 19 11:19:12 crc kubenswrapper[4765]: I0319 11:19:12.033581 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/tempest-tests-tempest" event={"ID":"65eabf0c-0a01-4d5b-aefd-d9ce064e1d66","Type":"ContainerDied","Data":"90a9f228b3cf02fbbd0ef331c015a214abf2cf7294e39b3d1b092f07f17a9706"} Mar 19 11:19:12 crc kubenswrapper[4765]: I0319 11:19:12.033648 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="90a9f228b3cf02fbbd0ef331c015a214abf2cf7294e39b3d1b092f07f17a9706" Mar 19 11:19:12 crc kubenswrapper[4765]: I0319 11:19:12.033754 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/tempest-tests-tempest" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.652130 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 11:19:13 crc kubenswrapper[4765]: E0319 11:19:13.652934 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783d095e-ef47-4b48-b93b-262f86ca1d43" containerName="extract-utilities" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.652950 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="783d095e-ef47-4b48-b93b-262f86ca1d43" containerName="extract-utilities" Mar 19 11:19:13 crc kubenswrapper[4765]: E0319 11:19:13.652984 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65eabf0c-0a01-4d5b-aefd-d9ce064e1d66" containerName="tempest-tests-tempest-tests-runner" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.652991 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="65eabf0c-0a01-4d5b-aefd-d9ce064e1d66" containerName="tempest-tests-tempest-tests-runner" Mar 19 11:19:13 crc kubenswrapper[4765]: E0319 11:19:13.653001 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783d095e-ef47-4b48-b93b-262f86ca1d43" containerName="extract-content" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.653008 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="783d095e-ef47-4b48-b93b-262f86ca1d43" containerName="extract-content" Mar 19 11:19:13 crc kubenswrapper[4765]: E0319 11:19:13.653033 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="783d095e-ef47-4b48-b93b-262f86ca1d43" containerName="registry-server" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.653039 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="783d095e-ef47-4b48-b93b-262f86ca1d43" containerName="registry-server" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.653245 4765 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="783d095e-ef47-4b48-b93b-262f86ca1d43" containerName="registry-server" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.653259 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="65eabf0c-0a01-4d5b-aefd-d9ce064e1d66" containerName="tempest-tests-tempest-tests-runner" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.653871 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.656593 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"default-dockercfg-whrlw" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.665726 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.710054 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zkq65\" (UniqueName: \"kubernetes.io/projected/9bf5c002-a318-47f6-8ba0-5a39c88daeff-kube-api-access-zkq65\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9bf5c002-a318-47f6-8ba0-5a39c88daeff\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.710210 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9bf5c002-a318-47f6-8ba0-5a39c88daeff\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.812124 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9bf5c002-a318-47f6-8ba0-5a39c88daeff\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.812243 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zkq65\" (UniqueName: \"kubernetes.io/projected/9bf5c002-a318-47f6-8ba0-5a39c88daeff-kube-api-access-zkq65\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9bf5c002-a318-47f6-8ba0-5a39c88daeff\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.812626 4765 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9bf5c002-a318-47f6-8ba0-5a39c88daeff\") device mount path \"/mnt/openstack/pv10\"" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.833672 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zkq65\" (UniqueName: \"kubernetes.io/projected/9bf5c002-a318-47f6-8ba0-5a39c88daeff-kube-api-access-zkq65\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9bf5c002-a318-47f6-8ba0-5a39c88daeff\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 11:19:13 crc kubenswrapper[4765]: I0319 11:19:13.843870 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"test-operator-logs-pod-tempest-tempest-tests-tempest\" (UID: \"9bf5c002-a318-47f6-8ba0-5a39c88daeff\") " pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 11:19:13 
crc kubenswrapper[4765]: I0319 11:19:13.974565 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" Mar 19 11:19:14 crc kubenswrapper[4765]: I0319 11:19:14.447281 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 11:19:14 crc kubenswrapper[4765]: I0319 11:19:14.448266 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/test-operator-logs-pod-tempest-tempest-tests-tempest"] Mar 19 11:19:15 crc kubenswrapper[4765]: I0319 11:19:15.058658 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9bf5c002-a318-47f6-8ba0-5a39c88daeff","Type":"ContainerStarted","Data":"6082b59094de485e8be22339c3297704f4c60f7644d2231e9d15c2f77cfa27ff"} Mar 19 11:19:16 crc kubenswrapper[4765]: I0319 11:19:16.072472 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" event={"ID":"9bf5c002-a318-47f6-8ba0-5a39c88daeff","Type":"ContainerStarted","Data":"39edb03cf0084e338297772092e38e62f63d4b5ef7077c049c18a8c727b8f010"} Mar 19 11:19:16 crc kubenswrapper[4765]: I0319 11:19:16.089375 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/test-operator-logs-pod-tempest-tempest-tests-tempest" podStartSLOduration=2.277761448 podStartE2EDuration="3.089350424s" podCreationTimestamp="2026-03-19 11:19:13 +0000 UTC" firstStartedPulling="2026-03-19 11:19:14.446991415 +0000 UTC m=+3452.795936957" lastFinishedPulling="2026-03-19 11:19:15.258580391 +0000 UTC m=+3453.607525933" observedRunningTime="2026-03-19 11:19:16.088858741 +0000 UTC m=+3454.437804283" watchObservedRunningTime="2026-03-19 11:19:16.089350424 +0000 UTC m=+3454.438295976" Mar 19 11:19:40 crc kubenswrapper[4765]: I0319 11:19:40.767520 4765 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-vk8z8/must-gather-bn4w7"] Mar 19 11:19:40 crc kubenswrapper[4765]: I0319 11:19:40.778077 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vk8z8/must-gather-bn4w7" Mar 19 11:19:40 crc kubenswrapper[4765]: I0319 11:19:40.791060 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vk8z8"/"kube-root-ca.crt" Mar 19 11:19:40 crc kubenswrapper[4765]: I0319 11:19:40.800430 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-vk8z8"/"default-dockercfg-lp2hv" Mar 19 11:19:40 crc kubenswrapper[4765]: I0319 11:19:40.800514 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-vk8z8"/"openshift-service-ca.crt" Mar 19 11:19:40 crc kubenswrapper[4765]: I0319 11:19:40.890670 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vk8z8/must-gather-bn4w7"] Mar 19 11:19:40 crc kubenswrapper[4765]: I0319 11:19:40.948067 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8jrv\" (UniqueName: \"kubernetes.io/projected/db728e4f-3565-48a8-a6ad-7b725e672b93-kube-api-access-r8jrv\") pod \"must-gather-bn4w7\" (UID: \"db728e4f-3565-48a8-a6ad-7b725e672b93\") " pod="openshift-must-gather-vk8z8/must-gather-bn4w7" Mar 19 11:19:40 crc kubenswrapper[4765]: I0319 11:19:40.948242 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db728e4f-3565-48a8-a6ad-7b725e672b93-must-gather-output\") pod \"must-gather-bn4w7\" (UID: \"db728e4f-3565-48a8-a6ad-7b725e672b93\") " pod="openshift-must-gather-vk8z8/must-gather-bn4w7" Mar 19 11:19:41 crc kubenswrapper[4765]: I0319 11:19:41.050584 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r8jrv\" (UniqueName: 
\"kubernetes.io/projected/db728e4f-3565-48a8-a6ad-7b725e672b93-kube-api-access-r8jrv\") pod \"must-gather-bn4w7\" (UID: \"db728e4f-3565-48a8-a6ad-7b725e672b93\") " pod="openshift-must-gather-vk8z8/must-gather-bn4w7" Mar 19 11:19:41 crc kubenswrapper[4765]: I0319 11:19:41.051106 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db728e4f-3565-48a8-a6ad-7b725e672b93-must-gather-output\") pod \"must-gather-bn4w7\" (UID: \"db728e4f-3565-48a8-a6ad-7b725e672b93\") " pod="openshift-must-gather-vk8z8/must-gather-bn4w7" Mar 19 11:19:41 crc kubenswrapper[4765]: I0319 11:19:41.051586 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db728e4f-3565-48a8-a6ad-7b725e672b93-must-gather-output\") pod \"must-gather-bn4w7\" (UID: \"db728e4f-3565-48a8-a6ad-7b725e672b93\") " pod="openshift-must-gather-vk8z8/must-gather-bn4w7" Mar 19 11:19:41 crc kubenswrapper[4765]: I0319 11:19:41.075936 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8jrv\" (UniqueName: \"kubernetes.io/projected/db728e4f-3565-48a8-a6ad-7b725e672b93-kube-api-access-r8jrv\") pod \"must-gather-bn4w7\" (UID: \"db728e4f-3565-48a8-a6ad-7b725e672b93\") " pod="openshift-must-gather-vk8z8/must-gather-bn4w7" Mar 19 11:19:41 crc kubenswrapper[4765]: I0319 11:19:41.137894 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vk8z8/must-gather-bn4w7" Mar 19 11:19:41 crc kubenswrapper[4765]: I0319 11:19:41.661604 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-vk8z8/must-gather-bn4w7"] Mar 19 11:19:42 crc kubenswrapper[4765]: I0319 11:19:42.315567 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vk8z8/must-gather-bn4w7" event={"ID":"db728e4f-3565-48a8-a6ad-7b725e672b93","Type":"ContainerStarted","Data":"21557ea7e374a9d496ad58693748634b4646ba631bbb8e2ab42324d781e52de9"} Mar 19 11:19:48 crc kubenswrapper[4765]: I0319 11:19:48.419869 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vk8z8/must-gather-bn4w7" event={"ID":"db728e4f-3565-48a8-a6ad-7b725e672b93","Type":"ContainerStarted","Data":"e533d82f4b8e23e3dce5b308ccddfc07652c194c518aef279b6c04022ca4bf7b"} Mar 19 11:19:48 crc kubenswrapper[4765]: I0319 11:19:48.420488 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vk8z8/must-gather-bn4w7" event={"ID":"db728e4f-3565-48a8-a6ad-7b725e672b93","Type":"ContainerStarted","Data":"5cce4dd943518d2733f1343d0221ca9096b250e1496e862371fa404e565e916a"} Mar 19 11:19:48 crc kubenswrapper[4765]: I0319 11:19:48.444155 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vk8z8/must-gather-bn4w7" podStartSLOduration=2.4281461 podStartE2EDuration="8.444132283s" podCreationTimestamp="2026-03-19 11:19:40 +0000 UTC" firstStartedPulling="2026-03-19 11:19:41.669921722 +0000 UTC m=+3480.018867264" lastFinishedPulling="2026-03-19 11:19:47.685907885 +0000 UTC m=+3486.034853447" observedRunningTime="2026-03-19 11:19:48.438636105 +0000 UTC m=+3486.787581667" watchObservedRunningTime="2026-03-19 11:19:48.444132283 +0000 UTC m=+3486.793077825" Mar 19 11:19:51 crc kubenswrapper[4765]: I0319 11:19:51.719364 4765 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-must-gather-vk8z8/crc-debug-8xgbn"] Mar 19 11:19:51 crc kubenswrapper[4765]: I0319 11:19:51.721695 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vk8z8/crc-debug-8xgbn" Mar 19 11:19:51 crc kubenswrapper[4765]: I0319 11:19:51.901433 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/42889d55-04a4-4f78-b2fc-a88c0bc79caf-host\") pod \"crc-debug-8xgbn\" (UID: \"42889d55-04a4-4f78-b2fc-a88c0bc79caf\") " pod="openshift-must-gather-vk8z8/crc-debug-8xgbn" Mar 19 11:19:51 crc kubenswrapper[4765]: I0319 11:19:51.901529 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flhrb\" (UniqueName: \"kubernetes.io/projected/42889d55-04a4-4f78-b2fc-a88c0bc79caf-kube-api-access-flhrb\") pod \"crc-debug-8xgbn\" (UID: \"42889d55-04a4-4f78-b2fc-a88c0bc79caf\") " pod="openshift-must-gather-vk8z8/crc-debug-8xgbn" Mar 19 11:19:52 crc kubenswrapper[4765]: I0319 11:19:52.003055 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/42889d55-04a4-4f78-b2fc-a88c0bc79caf-host\") pod \"crc-debug-8xgbn\" (UID: \"42889d55-04a4-4f78-b2fc-a88c0bc79caf\") " pod="openshift-must-gather-vk8z8/crc-debug-8xgbn" Mar 19 11:19:52 crc kubenswrapper[4765]: I0319 11:19:52.003152 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-flhrb\" (UniqueName: \"kubernetes.io/projected/42889d55-04a4-4f78-b2fc-a88c0bc79caf-kube-api-access-flhrb\") pod \"crc-debug-8xgbn\" (UID: \"42889d55-04a4-4f78-b2fc-a88c0bc79caf\") " pod="openshift-must-gather-vk8z8/crc-debug-8xgbn" Mar 19 11:19:52 crc kubenswrapper[4765]: I0319 11:19:52.003510 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: 
\"kubernetes.io/host-path/42889d55-04a4-4f78-b2fc-a88c0bc79caf-host\") pod \"crc-debug-8xgbn\" (UID: \"42889d55-04a4-4f78-b2fc-a88c0bc79caf\") " pod="openshift-must-gather-vk8z8/crc-debug-8xgbn" Mar 19 11:19:52 crc kubenswrapper[4765]: I0319 11:19:52.027389 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-flhrb\" (UniqueName: \"kubernetes.io/projected/42889d55-04a4-4f78-b2fc-a88c0bc79caf-kube-api-access-flhrb\") pod \"crc-debug-8xgbn\" (UID: \"42889d55-04a4-4f78-b2fc-a88c0bc79caf\") " pod="openshift-must-gather-vk8z8/crc-debug-8xgbn" Mar 19 11:19:52 crc kubenswrapper[4765]: I0319 11:19:52.049533 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vk8z8/crc-debug-8xgbn" Mar 19 11:19:52 crc kubenswrapper[4765]: W0319 11:19:52.095779 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod42889d55_04a4_4f78_b2fc_a88c0bc79caf.slice/crio-ab305f379527c735789ec8c245a8b495217439a0822b3950d9db8727d0626b20 WatchSource:0}: Error finding container ab305f379527c735789ec8c245a8b495217439a0822b3950d9db8727d0626b20: Status 404 returned error can't find the container with id ab305f379527c735789ec8c245a8b495217439a0822b3950d9db8727d0626b20 Mar 19 11:19:52 crc kubenswrapper[4765]: I0319 11:19:52.453892 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vk8z8/crc-debug-8xgbn" event={"ID":"42889d55-04a4-4f78-b2fc-a88c0bc79caf","Type":"ContainerStarted","Data":"ab305f379527c735789ec8c245a8b495217439a0822b3950d9db8727d0626b20"} Mar 19 11:20:00 crc kubenswrapper[4765]: I0319 11:20:00.150867 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565320-hcmhc"] Mar 19 11:20:00 crc kubenswrapper[4765]: I0319 11:20:00.152937 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565320-hcmhc" Mar 19 11:20:00 crc kubenswrapper[4765]: I0319 11:20:00.160593 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:20:00 crc kubenswrapper[4765]: I0319 11:20:00.160875 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:20:00 crc kubenswrapper[4765]: I0319 11:20:00.161085 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:20:00 crc kubenswrapper[4765]: I0319 11:20:00.172831 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565320-hcmhc"] Mar 19 11:20:00 crc kubenswrapper[4765]: I0319 11:20:00.287290 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6nt5\" (UniqueName: \"kubernetes.io/projected/c1d1a971-9704-4789-a6b2-c6250bab2e4e-kube-api-access-x6nt5\") pod \"auto-csr-approver-29565320-hcmhc\" (UID: \"c1d1a971-9704-4789-a6b2-c6250bab2e4e\") " pod="openshift-infra/auto-csr-approver-29565320-hcmhc" Mar 19 11:20:00 crc kubenswrapper[4765]: I0319 11:20:00.390819 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6nt5\" (UniqueName: \"kubernetes.io/projected/c1d1a971-9704-4789-a6b2-c6250bab2e4e-kube-api-access-x6nt5\") pod \"auto-csr-approver-29565320-hcmhc\" (UID: \"c1d1a971-9704-4789-a6b2-c6250bab2e4e\") " pod="openshift-infra/auto-csr-approver-29565320-hcmhc" Mar 19 11:20:00 crc kubenswrapper[4765]: I0319 11:20:00.433563 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6nt5\" (UniqueName: \"kubernetes.io/projected/c1d1a971-9704-4789-a6b2-c6250bab2e4e-kube-api-access-x6nt5\") pod \"auto-csr-approver-29565320-hcmhc\" (UID: \"c1d1a971-9704-4789-a6b2-c6250bab2e4e\") " 
pod="openshift-infra/auto-csr-approver-29565320-hcmhc" Mar 19 11:20:00 crc kubenswrapper[4765]: I0319 11:20:00.484633 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565320-hcmhc" Mar 19 11:20:04 crc kubenswrapper[4765]: I0319 11:20:04.570164 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565320-hcmhc"] Mar 19 11:20:04 crc kubenswrapper[4765]: W0319 11:20:04.578214 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc1d1a971_9704_4789_a6b2_c6250bab2e4e.slice/crio-3c951f5c5f96db8ad4f9fc5a748f3cc184c1186e5291d641b4f1c2c58f24aab6 WatchSource:0}: Error finding container 3c951f5c5f96db8ad4f9fc5a748f3cc184c1186e5291d641b4f1c2c58f24aab6: Status 404 returned error can't find the container with id 3c951f5c5f96db8ad4f9fc5a748f3cc184c1186e5291d641b4f1c2c58f24aab6 Mar 19 11:20:04 crc kubenswrapper[4765]: I0319 11:20:04.583232 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vk8z8/crc-debug-8xgbn" event={"ID":"42889d55-04a4-4f78-b2fc-a88c0bc79caf","Type":"ContainerStarted","Data":"c796cc740e296257b5817e48a670c1c95852e07a13ac844140866f450fb207af"} Mar 19 11:20:04 crc kubenswrapper[4765]: I0319 11:20:04.607302 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-vk8z8/crc-debug-8xgbn" podStartSLOduration=1.491425998 podStartE2EDuration="13.60727961s" podCreationTimestamp="2026-03-19 11:19:51 +0000 UTC" firstStartedPulling="2026-03-19 11:19:52.098415154 +0000 UTC m=+3490.447360696" lastFinishedPulling="2026-03-19 11:20:04.214268766 +0000 UTC m=+3502.563214308" observedRunningTime="2026-03-19 11:20:04.604257159 +0000 UTC m=+3502.953202701" watchObservedRunningTime="2026-03-19 11:20:04.60727961 +0000 UTC m=+3502.956225152" Mar 19 11:20:05 crc kubenswrapper[4765]: I0319 11:20:05.596980 4765 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565320-hcmhc" event={"ID":"c1d1a971-9704-4789-a6b2-c6250bab2e4e","Type":"ContainerStarted","Data":"3c951f5c5f96db8ad4f9fc5a748f3cc184c1186e5291d641b4f1c2c58f24aab6"} Mar 19 11:20:06 crc kubenswrapper[4765]: I0319 11:20:06.611496 4765 generic.go:334] "Generic (PLEG): container finished" podID="c1d1a971-9704-4789-a6b2-c6250bab2e4e" containerID="aa84a26389328563efc4b462881d3f2f6283d9512bcf29f60617d26c9ebc4eaf" exitCode=0 Mar 19 11:20:06 crc kubenswrapper[4765]: I0319 11:20:06.612444 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565320-hcmhc" event={"ID":"c1d1a971-9704-4789-a6b2-c6250bab2e4e","Type":"ContainerDied","Data":"aa84a26389328563efc4b462881d3f2f6283d9512bcf29f60617d26c9ebc4eaf"} Mar 19 11:20:08 crc kubenswrapper[4765]: I0319 11:20:08.005139 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565320-hcmhc" Mar 19 11:20:08 crc kubenswrapper[4765]: I0319 11:20:08.151600 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6nt5\" (UniqueName: \"kubernetes.io/projected/c1d1a971-9704-4789-a6b2-c6250bab2e4e-kube-api-access-x6nt5\") pod \"c1d1a971-9704-4789-a6b2-c6250bab2e4e\" (UID: \"c1d1a971-9704-4789-a6b2-c6250bab2e4e\") " Mar 19 11:20:08 crc kubenswrapper[4765]: I0319 11:20:08.159816 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c1d1a971-9704-4789-a6b2-c6250bab2e4e-kube-api-access-x6nt5" (OuterVolumeSpecName: "kube-api-access-x6nt5") pod "c1d1a971-9704-4789-a6b2-c6250bab2e4e" (UID: "c1d1a971-9704-4789-a6b2-c6250bab2e4e"). InnerVolumeSpecName "kube-api-access-x6nt5". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:20:08 crc kubenswrapper[4765]: I0319 11:20:08.254150 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6nt5\" (UniqueName: \"kubernetes.io/projected/c1d1a971-9704-4789-a6b2-c6250bab2e4e-kube-api-access-x6nt5\") on node \"crc\" DevicePath \"\"" Mar 19 11:20:08 crc kubenswrapper[4765]: I0319 11:20:08.636004 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565320-hcmhc" event={"ID":"c1d1a971-9704-4789-a6b2-c6250bab2e4e","Type":"ContainerDied","Data":"3c951f5c5f96db8ad4f9fc5a748f3cc184c1186e5291d641b4f1c2c58f24aab6"} Mar 19 11:20:08 crc kubenswrapper[4765]: I0319 11:20:08.636556 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3c951f5c5f96db8ad4f9fc5a748f3cc184c1186e5291d641b4f1c2c58f24aab6" Mar 19 11:20:08 crc kubenswrapper[4765]: I0319 11:20:08.636051 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565320-hcmhc" Mar 19 11:20:09 crc kubenswrapper[4765]: I0319 11:20:09.080783 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565314-f4dpf"] Mar 19 11:20:09 crc kubenswrapper[4765]: I0319 11:20:09.091484 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565314-f4dpf"] Mar 19 11:20:10 crc kubenswrapper[4765]: I0319 11:20:10.367083 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1135c796-c575-429b-a789-93d9f8d093f3" path="/var/lib/kubelet/pods/1135c796-c575-429b-a789-93d9f8d093f3/volumes" Mar 19 11:20:15 crc kubenswrapper[4765]: I0319 11:20:15.896253 4765 scope.go:117] "RemoveContainer" containerID="b5243380a142956123d3cd0bcaac9cdd8e40ce6dbb344bd110ab0e9909fd0a89" Mar 19 11:20:44 crc kubenswrapper[4765]: I0319 11:20:44.985094 4765 generic.go:334] "Generic (PLEG): container finished" 
podID="42889d55-04a4-4f78-b2fc-a88c0bc79caf" containerID="c796cc740e296257b5817e48a670c1c95852e07a13ac844140866f450fb207af" exitCode=0 Mar 19 11:20:44 crc kubenswrapper[4765]: I0319 11:20:44.985226 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vk8z8/crc-debug-8xgbn" event={"ID":"42889d55-04a4-4f78-b2fc-a88c0bc79caf","Type":"ContainerDied","Data":"c796cc740e296257b5817e48a670c1c95852e07a13ac844140866f450fb207af"} Mar 19 11:20:46 crc kubenswrapper[4765]: I0319 11:20:46.105444 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vk8z8/crc-debug-8xgbn" Mar 19 11:20:46 crc kubenswrapper[4765]: I0319 11:20:46.151018 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vk8z8/crc-debug-8xgbn"] Mar 19 11:20:46 crc kubenswrapper[4765]: I0319 11:20:46.160497 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vk8z8/crc-debug-8xgbn"] Mar 19 11:20:46 crc kubenswrapper[4765]: I0319 11:20:46.272567 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flhrb\" (UniqueName: \"kubernetes.io/projected/42889d55-04a4-4f78-b2fc-a88c0bc79caf-kube-api-access-flhrb\") pod \"42889d55-04a4-4f78-b2fc-a88c0bc79caf\" (UID: \"42889d55-04a4-4f78-b2fc-a88c0bc79caf\") " Mar 19 11:20:46 crc kubenswrapper[4765]: I0319 11:20:46.272804 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/42889d55-04a4-4f78-b2fc-a88c0bc79caf-host\") pod \"42889d55-04a4-4f78-b2fc-a88c0bc79caf\" (UID: \"42889d55-04a4-4f78-b2fc-a88c0bc79caf\") " Mar 19 11:20:46 crc kubenswrapper[4765]: I0319 11:20:46.272929 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/42889d55-04a4-4f78-b2fc-a88c0bc79caf-host" (OuterVolumeSpecName: "host") pod "42889d55-04a4-4f78-b2fc-a88c0bc79caf" (UID: 
"42889d55-04a4-4f78-b2fc-a88c0bc79caf"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:20:46 crc kubenswrapper[4765]: I0319 11:20:46.273246 4765 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/42889d55-04a4-4f78-b2fc-a88c0bc79caf-host\") on node \"crc\" DevicePath \"\"" Mar 19 11:20:46 crc kubenswrapper[4765]: I0319 11:20:46.278734 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/42889d55-04a4-4f78-b2fc-a88c0bc79caf-kube-api-access-flhrb" (OuterVolumeSpecName: "kube-api-access-flhrb") pod "42889d55-04a4-4f78-b2fc-a88c0bc79caf" (UID: "42889d55-04a4-4f78-b2fc-a88c0bc79caf"). InnerVolumeSpecName "kube-api-access-flhrb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:20:46 crc kubenswrapper[4765]: I0319 11:20:46.366350 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="42889d55-04a4-4f78-b2fc-a88c0bc79caf" path="/var/lib/kubelet/pods/42889d55-04a4-4f78-b2fc-a88c0bc79caf/volumes" Mar 19 11:20:46 crc kubenswrapper[4765]: I0319 11:20:46.375267 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-flhrb\" (UniqueName: \"kubernetes.io/projected/42889d55-04a4-4f78-b2fc-a88c0bc79caf-kube-api-access-flhrb\") on node \"crc\" DevicePath \"\"" Mar 19 11:20:47 crc kubenswrapper[4765]: I0319 11:20:47.004170 4765 scope.go:117] "RemoveContainer" containerID="c796cc740e296257b5817e48a670c1c95852e07a13ac844140866f450fb207af" Mar 19 11:20:47 crc kubenswrapper[4765]: I0319 11:20:47.004219 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vk8z8/crc-debug-8xgbn" Mar 19 11:20:47 crc kubenswrapper[4765]: I0319 11:20:47.336505 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vk8z8/crc-debug-d2cw9"] Mar 19 11:20:47 crc kubenswrapper[4765]: E0319 11:20:47.337246 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="42889d55-04a4-4f78-b2fc-a88c0bc79caf" containerName="container-00" Mar 19 11:20:47 crc kubenswrapper[4765]: I0319 11:20:47.337262 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="42889d55-04a4-4f78-b2fc-a88c0bc79caf" containerName="container-00" Mar 19 11:20:47 crc kubenswrapper[4765]: E0319 11:20:47.337286 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c1d1a971-9704-4789-a6b2-c6250bab2e4e" containerName="oc" Mar 19 11:20:47 crc kubenswrapper[4765]: I0319 11:20:47.337294 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c1d1a971-9704-4789-a6b2-c6250bab2e4e" containerName="oc" Mar 19 11:20:47 crc kubenswrapper[4765]: I0319 11:20:47.337531 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="c1d1a971-9704-4789-a6b2-c6250bab2e4e" containerName="oc" Mar 19 11:20:47 crc kubenswrapper[4765]: I0319 11:20:47.337566 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="42889d55-04a4-4f78-b2fc-a88c0bc79caf" containerName="container-00" Mar 19 11:20:47 crc kubenswrapper[4765]: I0319 11:20:47.338220 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vk8z8/crc-debug-d2cw9" Mar 19 11:20:47 crc kubenswrapper[4765]: I0319 11:20:47.494048 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkgb8\" (UniqueName: \"kubernetes.io/projected/585130ee-fa53-464e-afc9-a2eed885507e-kube-api-access-vkgb8\") pod \"crc-debug-d2cw9\" (UID: \"585130ee-fa53-464e-afc9-a2eed885507e\") " pod="openshift-must-gather-vk8z8/crc-debug-d2cw9" Mar 19 11:20:47 crc kubenswrapper[4765]: I0319 11:20:47.494712 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/585130ee-fa53-464e-afc9-a2eed885507e-host\") pod \"crc-debug-d2cw9\" (UID: \"585130ee-fa53-464e-afc9-a2eed885507e\") " pod="openshift-must-gather-vk8z8/crc-debug-d2cw9" Mar 19 11:20:47 crc kubenswrapper[4765]: I0319 11:20:47.597223 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vkgb8\" (UniqueName: \"kubernetes.io/projected/585130ee-fa53-464e-afc9-a2eed885507e-kube-api-access-vkgb8\") pod \"crc-debug-d2cw9\" (UID: \"585130ee-fa53-464e-afc9-a2eed885507e\") " pod="openshift-must-gather-vk8z8/crc-debug-d2cw9" Mar 19 11:20:47 crc kubenswrapper[4765]: I0319 11:20:47.597368 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/585130ee-fa53-464e-afc9-a2eed885507e-host\") pod \"crc-debug-d2cw9\" (UID: \"585130ee-fa53-464e-afc9-a2eed885507e\") " pod="openshift-must-gather-vk8z8/crc-debug-d2cw9" Mar 19 11:20:47 crc kubenswrapper[4765]: I0319 11:20:47.597610 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/585130ee-fa53-464e-afc9-a2eed885507e-host\") pod \"crc-debug-d2cw9\" (UID: \"585130ee-fa53-464e-afc9-a2eed885507e\") " pod="openshift-must-gather-vk8z8/crc-debug-d2cw9" Mar 19 11:20:47 crc 
kubenswrapper[4765]: I0319 11:20:47.616085 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkgb8\" (UniqueName: \"kubernetes.io/projected/585130ee-fa53-464e-afc9-a2eed885507e-kube-api-access-vkgb8\") pod \"crc-debug-d2cw9\" (UID: \"585130ee-fa53-464e-afc9-a2eed885507e\") " pod="openshift-must-gather-vk8z8/crc-debug-d2cw9" Mar 19 11:20:47 crc kubenswrapper[4765]: I0319 11:20:47.657886 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vk8z8/crc-debug-d2cw9" Mar 19 11:20:48 crc kubenswrapper[4765]: I0319 11:20:48.015584 4765 generic.go:334] "Generic (PLEG): container finished" podID="585130ee-fa53-464e-afc9-a2eed885507e" containerID="71f88fe51935548aad177e586e889db157aa7613c6840c8c513f1270754befa2" exitCode=0 Mar 19 11:20:48 crc kubenswrapper[4765]: I0319 11:20:48.015684 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vk8z8/crc-debug-d2cw9" event={"ID":"585130ee-fa53-464e-afc9-a2eed885507e","Type":"ContainerDied","Data":"71f88fe51935548aad177e586e889db157aa7613c6840c8c513f1270754befa2"} Mar 19 11:20:48 crc kubenswrapper[4765]: I0319 11:20:48.016015 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vk8z8/crc-debug-d2cw9" event={"ID":"585130ee-fa53-464e-afc9-a2eed885507e","Type":"ContainerStarted","Data":"7d2a1e7d7147281ecbfd239213aed183ccadc674f2c5fd05abd2b1d4ddf1bbe2"} Mar 19 11:20:48 crc kubenswrapper[4765]: I0319 11:20:48.529203 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vk8z8/crc-debug-d2cw9"] Mar 19 11:20:48 crc kubenswrapper[4765]: I0319 11:20:48.537826 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vk8z8/crc-debug-d2cw9"] Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.132487 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vk8z8/crc-debug-d2cw9" Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.329839 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/585130ee-fa53-464e-afc9-a2eed885507e-host\") pod \"585130ee-fa53-464e-afc9-a2eed885507e\" (UID: \"585130ee-fa53-464e-afc9-a2eed885507e\") " Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.330099 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vkgb8\" (UniqueName: \"kubernetes.io/projected/585130ee-fa53-464e-afc9-a2eed885507e-kube-api-access-vkgb8\") pod \"585130ee-fa53-464e-afc9-a2eed885507e\" (UID: \"585130ee-fa53-464e-afc9-a2eed885507e\") " Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.331240 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/585130ee-fa53-464e-afc9-a2eed885507e-host" (OuterVolumeSpecName: "host") pod "585130ee-fa53-464e-afc9-a2eed885507e" (UID: "585130ee-fa53-464e-afc9-a2eed885507e"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.336578 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/585130ee-fa53-464e-afc9-a2eed885507e-kube-api-access-vkgb8" (OuterVolumeSpecName: "kube-api-access-vkgb8") pod "585130ee-fa53-464e-afc9-a2eed885507e" (UID: "585130ee-fa53-464e-afc9-a2eed885507e"). InnerVolumeSpecName "kube-api-access-vkgb8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.433028 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vkgb8\" (UniqueName: \"kubernetes.io/projected/585130ee-fa53-464e-afc9-a2eed885507e-kube-api-access-vkgb8\") on node \"crc\" DevicePath \"\"" Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.433566 4765 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/585130ee-fa53-464e-afc9-a2eed885507e-host\") on node \"crc\" DevicePath \"\"" Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.695709 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-vk8z8/crc-debug-rk5fq"] Mar 19 11:20:49 crc kubenswrapper[4765]: E0319 11:20:49.696101 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="585130ee-fa53-464e-afc9-a2eed885507e" containerName="container-00" Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.696114 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="585130ee-fa53-464e-afc9-a2eed885507e" containerName="container-00" Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.696368 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="585130ee-fa53-464e-afc9-a2eed885507e" containerName="container-00" Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.697116 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vk8z8/crc-debug-rk5fq" Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.740097 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ec384ee-632e-4be2-a5ad-be645023104d-host\") pod \"crc-debug-rk5fq\" (UID: \"3ec384ee-632e-4be2-a5ad-be645023104d\") " pod="openshift-must-gather-vk8z8/crc-debug-rk5fq" Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.740461 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s2rqf\" (UniqueName: \"kubernetes.io/projected/3ec384ee-632e-4be2-a5ad-be645023104d-kube-api-access-s2rqf\") pod \"crc-debug-rk5fq\" (UID: \"3ec384ee-632e-4be2-a5ad-be645023104d\") " pod="openshift-must-gather-vk8z8/crc-debug-rk5fq" Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.842155 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2rqf\" (UniqueName: \"kubernetes.io/projected/3ec384ee-632e-4be2-a5ad-be645023104d-kube-api-access-s2rqf\") pod \"crc-debug-rk5fq\" (UID: \"3ec384ee-632e-4be2-a5ad-be645023104d\") " pod="openshift-must-gather-vk8z8/crc-debug-rk5fq" Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.842312 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ec384ee-632e-4be2-a5ad-be645023104d-host\") pod \"crc-debug-rk5fq\" (UID: \"3ec384ee-632e-4be2-a5ad-be645023104d\") " pod="openshift-must-gather-vk8z8/crc-debug-rk5fq" Mar 19 11:20:49 crc kubenswrapper[4765]: I0319 11:20:49.842443 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ec384ee-632e-4be2-a5ad-be645023104d-host\") pod \"crc-debug-rk5fq\" (UID: \"3ec384ee-632e-4be2-a5ad-be645023104d\") " pod="openshift-must-gather-vk8z8/crc-debug-rk5fq" Mar 19 11:20:49 crc 
kubenswrapper[4765]: I0319 11:20:49.858953 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2rqf\" (UniqueName: \"kubernetes.io/projected/3ec384ee-632e-4be2-a5ad-be645023104d-kube-api-access-s2rqf\") pod \"crc-debug-rk5fq\" (UID: \"3ec384ee-632e-4be2-a5ad-be645023104d\") " pod="openshift-must-gather-vk8z8/crc-debug-rk5fq" Mar 19 11:20:50 crc kubenswrapper[4765]: I0319 11:20:50.013677 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vk8z8/crc-debug-rk5fq" Mar 19 11:20:50 crc kubenswrapper[4765]: W0319 11:20:50.040386 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3ec384ee_632e_4be2_a5ad_be645023104d.slice/crio-69d3a88d482451fadf2b6f691f45ec39f10d4423d2b098451e6494019052baf5 WatchSource:0}: Error finding container 69d3a88d482451fadf2b6f691f45ec39f10d4423d2b098451e6494019052baf5: Status 404 returned error can't find the container with id 69d3a88d482451fadf2b6f691f45ec39f10d4423d2b098451e6494019052baf5 Mar 19 11:20:50 crc kubenswrapper[4765]: I0319 11:20:50.040454 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7d2a1e7d7147281ecbfd239213aed183ccadc674f2c5fd05abd2b1d4ddf1bbe2" Mar 19 11:20:50 crc kubenswrapper[4765]: I0319 11:20:50.040501 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vk8z8/crc-debug-d2cw9" Mar 19 11:20:50 crc kubenswrapper[4765]: I0319 11:20:50.366073 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="585130ee-fa53-464e-afc9-a2eed885507e" path="/var/lib/kubelet/pods/585130ee-fa53-464e-afc9-a2eed885507e/volumes" Mar 19 11:20:51 crc kubenswrapper[4765]: I0319 11:20:51.049831 4765 generic.go:334] "Generic (PLEG): container finished" podID="3ec384ee-632e-4be2-a5ad-be645023104d" containerID="925e135052a61f5ff9db19db77a3363ce79c03cb643464201deda79e6a367804" exitCode=0 Mar 19 11:20:51 crc kubenswrapper[4765]: I0319 11:20:51.049903 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vk8z8/crc-debug-rk5fq" event={"ID":"3ec384ee-632e-4be2-a5ad-be645023104d","Type":"ContainerDied","Data":"925e135052a61f5ff9db19db77a3363ce79c03cb643464201deda79e6a367804"} Mar 19 11:20:51 crc kubenswrapper[4765]: I0319 11:20:51.050208 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vk8z8/crc-debug-rk5fq" event={"ID":"3ec384ee-632e-4be2-a5ad-be645023104d","Type":"ContainerStarted","Data":"69d3a88d482451fadf2b6f691f45ec39f10d4423d2b098451e6494019052baf5"} Mar 19 11:20:51 crc kubenswrapper[4765]: I0319 11:20:51.091980 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vk8z8/crc-debug-rk5fq"] Mar 19 11:20:51 crc kubenswrapper[4765]: I0319 11:20:51.100810 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vk8z8/crc-debug-rk5fq"] Mar 19 11:20:52 crc kubenswrapper[4765]: I0319 11:20:52.167756 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vk8z8/crc-debug-rk5fq" Mar 19 11:20:52 crc kubenswrapper[4765]: I0319 11:20:52.287355 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ec384ee-632e-4be2-a5ad-be645023104d-host\") pod \"3ec384ee-632e-4be2-a5ad-be645023104d\" (UID: \"3ec384ee-632e-4be2-a5ad-be645023104d\") " Mar 19 11:20:52 crc kubenswrapper[4765]: I0319 11:20:52.287496 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3ec384ee-632e-4be2-a5ad-be645023104d-host" (OuterVolumeSpecName: "host") pod "3ec384ee-632e-4be2-a5ad-be645023104d" (UID: "3ec384ee-632e-4be2-a5ad-be645023104d"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:20:52 crc kubenswrapper[4765]: I0319 11:20:52.287549 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s2rqf\" (UniqueName: \"kubernetes.io/projected/3ec384ee-632e-4be2-a5ad-be645023104d-kube-api-access-s2rqf\") pod \"3ec384ee-632e-4be2-a5ad-be645023104d\" (UID: \"3ec384ee-632e-4be2-a5ad-be645023104d\") " Mar 19 11:20:52 crc kubenswrapper[4765]: I0319 11:20:52.287984 4765 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3ec384ee-632e-4be2-a5ad-be645023104d-host\") on node \"crc\" DevicePath \"\"" Mar 19 11:20:52 crc kubenswrapper[4765]: I0319 11:20:52.293218 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ec384ee-632e-4be2-a5ad-be645023104d-kube-api-access-s2rqf" (OuterVolumeSpecName: "kube-api-access-s2rqf") pod "3ec384ee-632e-4be2-a5ad-be645023104d" (UID: "3ec384ee-632e-4be2-a5ad-be645023104d"). InnerVolumeSpecName "kube-api-access-s2rqf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:20:52 crc kubenswrapper[4765]: I0319 11:20:52.367634 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ec384ee-632e-4be2-a5ad-be645023104d" path="/var/lib/kubelet/pods/3ec384ee-632e-4be2-a5ad-be645023104d/volumes" Mar 19 11:20:52 crc kubenswrapper[4765]: I0319 11:20:52.389627 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s2rqf\" (UniqueName: \"kubernetes.io/projected/3ec384ee-632e-4be2-a5ad-be645023104d-kube-api-access-s2rqf\") on node \"crc\" DevicePath \"\"" Mar 19 11:20:53 crc kubenswrapper[4765]: I0319 11:20:53.069515 4765 scope.go:117] "RemoveContainer" containerID="925e135052a61f5ff9db19db77a3363ce79c03cb643464201deda79e6a367804" Mar 19 11:20:53 crc kubenswrapper[4765]: I0319 11:20:53.069541 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vk8z8/crc-debug-rk5fq" Mar 19 11:21:07 crc kubenswrapper[4765]: I0319 11:21:07.822714 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-77865d778-4kfkp_446b5005-1960-413b-8ab2-f0da071ab4ba/barbican-api/0.log" Mar 19 11:21:08 crc kubenswrapper[4765]: I0319 11:21:08.023375 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6955bd84cd-t7qkv_65d1a29f-39b3-40d7-9db2-246fc05348cc/barbican-keystone-listener/0.log" Mar 19 11:21:08 crc kubenswrapper[4765]: I0319 11:21:08.026183 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-77865d778-4kfkp_446b5005-1960-413b-8ab2-f0da071ab4ba/barbican-api-log/0.log" Mar 19 11:21:08 crc kubenswrapper[4765]: I0319 11:21:08.097369 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6955bd84cd-t7qkv_65d1a29f-39b3-40d7-9db2-246fc05348cc/barbican-keystone-listener-log/0.log" Mar 19 11:21:08 crc kubenswrapper[4765]: I0319 11:21:08.262847 4765 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7f975545dc-7gv92_b18f0688-cbc1-49ae-a721-b964e45cc1ea/barbican-worker/0.log" Mar 19 11:21:08 crc kubenswrapper[4765]: I0319 11:21:08.308513 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7f975545dc-7gv92_b18f0688-cbc1-49ae-a721-b964e45cc1ea/barbican-worker-log/0.log" Mar 19 11:21:08 crc kubenswrapper[4765]: I0319 11:21:08.579148 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5ff62acd-7f88-46c9-bd52-150092370b2d/ceilometer-central-agent/0.log" Mar 19 11:21:08 crc kubenswrapper[4765]: I0319 11:21:08.601536 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5ff62acd-7f88-46c9-bd52-150092370b2d/ceilometer-notification-agent/0.log" Mar 19 11:21:08 crc kubenswrapper[4765]: I0319 11:21:08.629846 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm_0050f5ac-5380-49b0-98ad-fdd7c3b94f51/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:08 crc kubenswrapper[4765]: I0319 11:21:08.782195 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5ff62acd-7f88-46c9-bd52-150092370b2d/sg-core/0.log" Mar 19 11:21:08 crc kubenswrapper[4765]: I0319 11:21:08.786733 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5ff62acd-7f88-46c9-bd52-150092370b2d/proxy-httpd/0.log" Mar 19 11:21:08 crc kubenswrapper[4765]: I0319 11:21:08.912357 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7/cinder-api/0.log" Mar 19 11:21:08 crc kubenswrapper[4765]: I0319 11:21:08.980697 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7/cinder-api-log/0.log" Mar 19 11:21:09 crc kubenswrapper[4765]: I0319 11:21:09.087435 4765 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_da64a060-18bb-4b34-9374-1fec5ad88ede/cinder-scheduler/0.log" Mar 19 11:21:09 crc kubenswrapper[4765]: I0319 11:21:09.176913 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_da64a060-18bb-4b34-9374-1fec5ad88ede/probe/0.log" Mar 19 11:21:09 crc kubenswrapper[4765]: I0319 11:21:09.314859 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw_fd32b580-78f7-478e-ba1d-9d1a86b75f3a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:09 crc kubenswrapper[4765]: I0319 11:21:09.605922 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g_546dffbe-3a15-4074-a5be-deac4d1530e3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:09 crc kubenswrapper[4765]: I0319 11:21:09.766821 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-t5pnc_8051ab1f-9c27-4c1b-b9f9-9f883c67bea9/init/0.log" Mar 19 11:21:09 crc kubenswrapper[4765]: I0319 11:21:09.880252 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-t5pnc_8051ab1f-9c27-4c1b-b9f9-9f883c67bea9/init/0.log" Mar 19 11:21:09 crc kubenswrapper[4765]: I0319 11:21:09.899228 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-t5pnc_8051ab1f-9c27-4c1b-b9f9-9f883c67bea9/dnsmasq-dns/0.log" Mar 19 11:21:10 crc kubenswrapper[4765]: I0319 11:21:10.021483 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr_5381bac5-1b71-4489-97fd-c49d0ae1783b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:10 crc kubenswrapper[4765]: I0319 11:21:10.128481 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_glance-default-external-api-0_0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac/glance-httpd/0.log" Mar 19 11:21:10 crc kubenswrapper[4765]: I0319 11:21:10.177454 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac/glance-log/0.log" Mar 19 11:21:10 crc kubenswrapper[4765]: I0319 11:21:10.314600 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02/glance-httpd/0.log" Mar 19 11:21:10 crc kubenswrapper[4765]: I0319 11:21:10.445186 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02/glance-log/0.log" Mar 19 11:21:10 crc kubenswrapper[4765]: I0319 11:21:10.600105 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6c6ff5646d-fmdz2_b506e362-44bf-4267-bea0-18131aa011fa/horizon/0.log" Mar 19 11:21:10 crc kubenswrapper[4765]: I0319 11:21:10.770334 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-ppstd_ac71ef52-bfb2-44d0-be24-71e8e5e58475/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:10 crc kubenswrapper[4765]: I0319 11:21:10.888088 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6c6ff5646d-fmdz2_b506e362-44bf-4267-bea0-18131aa011fa/horizon-log/0.log" Mar 19 11:21:11 crc kubenswrapper[4765]: I0319 11:21:11.343850 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4x5vg_c09e1efc-02d2-4e0f-9e16-36d9627e0fb8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:11 crc kubenswrapper[4765]: I0319 11:21:11.364736 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_keystone-cron-29565301-lwwfz_760a6c55-471d-478e-b75a-713476259c81/keystone-cron/0.log" Mar 19 11:21:11 crc kubenswrapper[4765]: I0319 11:21:11.367483 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-79688b6ffc-lc92w_c59a3da3-7154-4531-9bf8-96771979b410/keystone-api/0.log" Mar 19 11:21:11 crc kubenswrapper[4765]: I0319 11:21:11.564259 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_06359a74-a7cd-45ab-bc64-ef3d71373e5a/kube-state-metrics/0.log" Mar 19 11:21:12 crc kubenswrapper[4765]: I0319 11:21:12.059485 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-kplgb_895f9304-5267-4b0b-acac-7e0d279b8866/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:12 crc kubenswrapper[4765]: I0319 11:21:12.196632 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67b57ccc79-wx8k9_121bed92-a505-40d7-83f1-f3163088df2a/neutron-api/0.log" Mar 19 11:21:12 crc kubenswrapper[4765]: I0319 11:21:12.252785 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67b57ccc79-wx8k9_121bed92-a505-40d7-83f1-f3163088df2a/neutron-httpd/0.log" Mar 19 11:21:12 crc kubenswrapper[4765]: I0319 11:21:12.539838 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt_5987706f-bbd1-4eeb-908e-dd158089aea5/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:12 crc kubenswrapper[4765]: I0319 11:21:12.965604 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell0-conductor-0_3eb0de8e-0a7f-4324-8195-2bab8419c2ba/nova-cell0-conductor-conductor/0.log" Mar 19 11:21:12 crc kubenswrapper[4765]: I0319 11:21:12.989994 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-api-0_4b32fc33-5dc9-44b4-9313-1ad458fe9473/nova-api-log/0.log" Mar 19 11:21:13 crc kubenswrapper[4765]: I0319 11:21:13.363794 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4b32fc33-5dc9-44b4-9313-1ad458fe9473/nova-api-api/0.log" Mar 19 11:21:13 crc kubenswrapper[4765]: I0319 11:21:13.491257 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_1ead2164-cda8-432f-b397-2866c55ccdbd/nova-cell1-conductor-conductor/0.log" Mar 19 11:21:13 crc kubenswrapper[4765]: I0319 11:21:13.596593 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b92cc2fe-932d-4290-8331-225b2c5011d4/nova-cell1-novncproxy-novncproxy/0.log" Mar 19 11:21:13 crc kubenswrapper[4765]: I0319 11:21:13.928182 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5603f135-db39-4e98-b372-6ec55cbc3351/nova-metadata-log/0.log" Mar 19 11:21:14 crc kubenswrapper[4765]: I0319 11:21:14.322445 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_584d891c-1a52-4300-b19a-51a3594bdccf/nova-scheduler-scheduler/0.log" Mar 19 11:21:14 crc kubenswrapper[4765]: I0319 11:21:14.342775 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5603f135-db39-4e98-b372-6ec55cbc3351/nova-metadata-metadata/0.log" Mar 19 11:21:14 crc kubenswrapper[4765]: I0319 11:21:14.380816 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-skpp9_f9cf075c-03d2-4254-9ab9-5500d4f42186/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:14 crc kubenswrapper[4765]: I0319 11:21:14.515782 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_85cd8112-bca8-45df-b61a-d2690fbbfb16/mysql-bootstrap/0.log" Mar 19 11:21:14 crc kubenswrapper[4765]: I0319 11:21:14.779520 4765 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_85cd8112-bca8-45df-b61a-d2690fbbfb16/galera/0.log" Mar 19 11:21:14 crc kubenswrapper[4765]: I0319 11:21:14.794508 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adf887ce-99cf-47a0-89e8-2db5aa92a9ca/mysql-bootstrap/0.log" Mar 19 11:21:14 crc kubenswrapper[4765]: I0319 11:21:14.826773 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_85cd8112-bca8-45df-b61a-d2690fbbfb16/mysql-bootstrap/0.log" Mar 19 11:21:15 crc kubenswrapper[4765]: I0319 11:21:15.028483 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adf887ce-99cf-47a0-89e8-2db5aa92a9ca/mysql-bootstrap/0.log" Mar 19 11:21:15 crc kubenswrapper[4765]: I0319 11:21:15.072506 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_1cda8252-a988-49d1-a566-8d9989b86034/openstackclient/0.log" Mar 19 11:21:15 crc kubenswrapper[4765]: I0319 11:21:15.083056 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adf887ce-99cf-47a0-89e8-2db5aa92a9ca/galera/0.log" Mar 19 11:21:15 crc kubenswrapper[4765]: I0319 11:21:15.296364 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ct9xj_5272132e-561c-46b9-92c8-1714e40b3303/ovn-controller/0.log" Mar 19 11:21:15 crc kubenswrapper[4765]: I0319 11:21:15.430797 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-metrics-sq8vg_c2dd6b2c-bf15-47e4-b9c6-775b176fbadb/openstack-network-exporter/0.log" Mar 19 11:21:15 crc kubenswrapper[4765]: I0319 11:21:15.637432 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bmbgn_40f94856-44b1-42f4-9aa4-9b46f3fe13f3/ovsdb-server-init/0.log" Mar 19 11:21:15 crc kubenswrapper[4765]: I0319 11:21:15.872744 4765 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openstack_ovn-controller-ovs-bmbgn_40f94856-44b1-42f4-9aa4-9b46f3fe13f3/ovsdb-server/0.log" Mar 19 11:21:15 crc kubenswrapper[4765]: I0319 11:21:15.895203 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bmbgn_40f94856-44b1-42f4-9aa4-9b46f3fe13f3/ovs-vswitchd/0.log" Mar 19 11:21:15 crc kubenswrapper[4765]: I0319 11:21:15.979187 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bmbgn_40f94856-44b1-42f4-9aa4-9b46f3fe13f3/ovsdb-server-init/0.log" Mar 19 11:21:16 crc kubenswrapper[4765]: I0319 11:21:16.204738 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_95192c6a-3899-4f62-bfca-47ad91bd17f1/openstack-network-exporter/0.log" Mar 19 11:21:16 crc kubenswrapper[4765]: I0319 11:21:16.232112 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fgjhz_db0a9fa6-2229-425c-8170-ebcc7dce147f/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:16 crc kubenswrapper[4765]: I0319 11:21:16.285588 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_95192c6a-3899-4f62-bfca-47ad91bd17f1/ovn-northd/0.log" Mar 19 11:21:16 crc kubenswrapper[4765]: I0319 11:21:16.478641 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_17555a74-a31f-4d09-8b23-b8c774024c58/openstack-network-exporter/0.log" Mar 19 11:21:16 crc kubenswrapper[4765]: I0319 11:21:16.497207 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_17555a74-a31f-4d09-8b23-b8c774024c58/ovsdbserver-nb/0.log" Mar 19 11:21:16 crc kubenswrapper[4765]: I0319 11:21:16.687292 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_203ad8ad-1b9e-4191-99a0-7bfd9c193de8/openstack-network-exporter/0.log" Mar 19 11:21:16 crc kubenswrapper[4765]: I0319 11:21:16.723105 4765 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_203ad8ad-1b9e-4191-99a0-7bfd9c193de8/ovsdbserver-sb/0.log" Mar 19 11:21:16 crc kubenswrapper[4765]: I0319 11:21:16.887795 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54bc4cb6bd-w8bvw_21f8be56-b9b5-4205-8de4-dd4d204b9f3d/placement-api/0.log" Mar 19 11:21:17 crc kubenswrapper[4765]: I0319 11:21:17.031695 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54bc4cb6bd-w8bvw_21f8be56-b9b5-4205-8de4-dd4d204b9f3d/placement-log/0.log" Mar 19 11:21:17 crc kubenswrapper[4765]: I0319 11:21:17.118157 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c3043d68-f6dc-4095-bc0e-62b2282dd297/setup-container/0.log" Mar 19 11:21:17 crc kubenswrapper[4765]: I0319 11:21:17.260300 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c3043d68-f6dc-4095-bc0e-62b2282dd297/setup-container/0.log" Mar 19 11:21:17 crc kubenswrapper[4765]: I0319 11:21:17.351131 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c3043d68-f6dc-4095-bc0e-62b2282dd297/rabbitmq/0.log" Mar 19 11:21:17 crc kubenswrapper[4765]: I0319 11:21:17.611759 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_53406a09-7bdd-4517-ac01-0823bce386bc/setup-container/0.log" Mar 19 11:21:17 crc kubenswrapper[4765]: I0319 11:21:17.886907 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_53406a09-7bdd-4517-ac01-0823bce386bc/setup-container/0.log" Mar 19 11:21:17 crc kubenswrapper[4765]: I0319 11:21:17.900609 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_53406a09-7bdd-4517-ac01-0823bce386bc/rabbitmq/0.log" Mar 19 11:21:17 crc kubenswrapper[4765]: I0319 11:21:17.910564 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt_525004be-ff4e-4c2d-ad4d-0ed018eecc09/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:18 crc kubenswrapper[4765]: I0319 11:21:18.137637 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-zsxdz_af5d1fcd-a500-4d64-a86a-37cae82350d3/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:18 crc kubenswrapper[4765]: I0319 11:21:18.161195 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8_ab6c9186-ce11-4085-9c4c-c0964cb170d8/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:18 crc kubenswrapper[4765]: I0319 11:21:18.385536 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-jcnm5_0d2d350b-0950-4e12-8ae8-57c8983079aa/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:18 crc kubenswrapper[4765]: I0319 11:21:18.399825 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-gc6dd_f46a25a2-f362-487c-9511-b9888a18b08e/ssh-known-hosts-edpm-deployment/0.log" Mar 19 11:21:18 crc kubenswrapper[4765]: I0319 11:21:18.654841 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-556979b4dc-zj26d_00e0de39-87cf-4a6e-8980-a294f329e430/proxy-server/0.log" Mar 19 11:21:18 crc kubenswrapper[4765]: I0319 11:21:18.788225 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-556979b4dc-zj26d_00e0de39-87cf-4a6e-8980-a294f329e430/proxy-httpd/0.log" Mar 19 11:21:18 crc kubenswrapper[4765]: I0319 11:21:18.837013 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-f5w89_bd78fb4a-24b1-4fb7-8994-3668d29ff042/swift-ring-rebalance/0.log" Mar 19 11:21:18 crc kubenswrapper[4765]: I0319 11:21:18.963296 4765 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/account-auditor/0.log" Mar 19 11:21:18 crc kubenswrapper[4765]: I0319 11:21:18.974354 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/account-reaper/0.log" Mar 19 11:21:19 crc kubenswrapper[4765]: I0319 11:21:19.131153 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/account-replicator/0.log" Mar 19 11:21:19 crc kubenswrapper[4765]: I0319 11:21:19.224158 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/account-server/0.log" Mar 19 11:21:19 crc kubenswrapper[4765]: I0319 11:21:19.229720 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/container-auditor/0.log" Mar 19 11:21:19 crc kubenswrapper[4765]: I0319 11:21:19.281210 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/container-replicator/0.log" Mar 19 11:21:19 crc kubenswrapper[4765]: I0319 11:21:19.432542 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/container-updater/0.log" Mar 19 11:21:19 crc kubenswrapper[4765]: I0319 11:21:19.446575 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/container-server/0.log" Mar 19 11:21:19 crc kubenswrapper[4765]: I0319 11:21:19.535320 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/object-expirer/0.log" Mar 19 11:21:19 crc kubenswrapper[4765]: I0319 11:21:19.551238 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/object-auditor/0.log" Mar 19 11:21:19 crc kubenswrapper[4765]: I0319 11:21:19.666949 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/object-server/0.log" Mar 19 11:21:19 crc kubenswrapper[4765]: I0319 11:21:19.732184 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/object-replicator/0.log" Mar 19 11:21:19 crc kubenswrapper[4765]: I0319 11:21:19.794112 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/rsync/0.log" Mar 19 11:21:19 crc kubenswrapper[4765]: I0319 11:21:19.794273 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/object-updater/0.log" Mar 19 11:21:19 crc kubenswrapper[4765]: I0319 11:21:19.921987 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/swift-recon-cron/0.log" Mar 19 11:21:20 crc kubenswrapper[4765]: I0319 11:21:20.139053 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_65eabf0c-0a01-4d5b-aefd-d9ce064e1d66/tempest-tests-tempest-tests-runner/0.log" Mar 19 11:21:20 crc kubenswrapper[4765]: I0319 11:21:20.243945 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9bf5c002-a318-47f6-8ba0-5a39c88daeff/test-operator-logs-container/0.log" Mar 19 11:21:20 crc kubenswrapper[4765]: I0319 11:21:20.498026 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx_3a573215-571a-49dc-9903-82134a77d196/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:20 crc kubenswrapper[4765]: 
I0319 11:21:20.504590 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq_611d61c3-8dd1-46e4-a579-ded4e91917ed/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:21:29 crc kubenswrapper[4765]: I0319 11:21:29.997440 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_81d90cd2-d47a-47c5-aeff-20f377ed9159/memcached/0.log" Mar 19 11:21:31 crc kubenswrapper[4765]: I0319 11:21:31.655814 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:21:31 crc kubenswrapper[4765]: I0319 11:21:31.656104 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:21:47 crc kubenswrapper[4765]: I0319 11:21:47.636247 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck_b340ebf3-8897-41c3-8a3e-733e4afc3fdf/util/0.log" Mar 19 11:21:47 crc kubenswrapper[4765]: I0319 11:21:47.822161 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck_b340ebf3-8897-41c3-8a3e-733e4afc3fdf/util/0.log" Mar 19 11:21:47 crc kubenswrapper[4765]: I0319 11:21:47.833415 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck_b340ebf3-8897-41c3-8a3e-733e4afc3fdf/pull/0.log" Mar 19 11:21:47 crc 
kubenswrapper[4765]: I0319 11:21:47.863013 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck_b340ebf3-8897-41c3-8a3e-733e4afc3fdf/pull/0.log" Mar 19 11:21:48 crc kubenswrapper[4765]: I0319 11:21:48.069109 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck_b340ebf3-8897-41c3-8a3e-733e4afc3fdf/util/0.log" Mar 19 11:21:48 crc kubenswrapper[4765]: I0319 11:21:48.082212 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck_b340ebf3-8897-41c3-8a3e-733e4afc3fdf/pull/0.log" Mar 19 11:21:48 crc kubenswrapper[4765]: I0319 11:21:48.105198 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck_b340ebf3-8897-41c3-8a3e-733e4afc3fdf/extract/0.log" Mar 19 11:21:48 crc kubenswrapper[4765]: I0319 11:21:48.330818 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-kvrz2_1954f819-78c2-46fd-a6bf-c626d50ef527/manager/0.log" Mar 19 11:21:48 crc kubenswrapper[4765]: I0319 11:21:48.545772 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-6h4tg_c5e9a9c2-b7c8-4e1a-8c60-e6fcb0616e01/manager/0.log" Mar 19 11:21:48 crc kubenswrapper[4765]: I0319 11:21:48.762550 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-4zf8v_123b9f81-d315-44b3-a6ec-d777cc18ab7b/manager/0.log" Mar 19 11:21:48 crc kubenswrapper[4765]: I0319 11:21:48.769481 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-kzvs8_3e1ee5ea-abd4-4a73-840e-43fbd3732cfd/manager/0.log" Mar 19 11:21:49 crc kubenswrapper[4765]: I0319 11:21:49.048493 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-qrqf8_047a8026-b206-4eb6-9630-3b550af68d3a/manager/0.log" Mar 19 11:21:49 crc kubenswrapper[4765]: I0319 11:21:49.348105 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-ncc44_1d54708d-8829-411d-a632-ce3b53b7aeaa/manager/0.log" Mar 19 11:21:49 crc kubenswrapper[4765]: I0319 11:21:49.354250 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-p27jn_cdeba207-ced7-4575-9c08-c001d85b0a93/manager/0.log" Mar 19 11:21:49 crc kubenswrapper[4765]: I0319 11:21:49.539465 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-2gnht_0cd862fe-c896-4fa6-a9ba-b1af6441f777/manager/0.log" Mar 19 11:21:49 crc kubenswrapper[4765]: I0319 11:21:49.671111 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-q9lpg_b94397e1-cedc-4048-9253-12c60b0a9bfd/manager/0.log" Mar 19 11:21:49 crc kubenswrapper[4765]: I0319 11:21:49.749715 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-x8k5g_9225dfe1-877e-43a2-9034-0e355019aa04/manager/0.log" Mar 19 11:21:49 crc kubenswrapper[4765]: I0319 11:21:49.851403 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-bg8b9_d9dca6f4-a577-44fa-a959-8398fb57dca0/manager/0.log" Mar 19 11:21:49 crc kubenswrapper[4765]: I0319 11:21:49.999376 4765 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-78t28_adc31858-63eb-4d03-b79c-c1a4054725af/manager/0.log" Mar 19 11:21:50 crc kubenswrapper[4765]: I0319 11:21:50.117923 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-wxsgc_6647190b-c26b-4c57-bc84-7e5cfe6a5649/manager/0.log" Mar 19 11:21:50 crc kubenswrapper[4765]: I0319 11:21:50.202345 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-b4kfn_473e9670-e72d-4e54-8b06-9d73666cbfc0/manager/0.log" Mar 19 11:21:50 crc kubenswrapper[4765]: I0319 11:21:50.405292 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-vqghn_981806c8-2390-44ac-a6f8-81c5f5bb0374/manager/0.log" Mar 19 11:21:50 crc kubenswrapper[4765]: I0319 11:21:50.492558 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-76ccd786f6-h42th_538ce45d-9424-41e4-8d9e-ff63db0df6be/operator/0.log" Mar 19 11:21:50 crc kubenswrapper[4765]: I0319 11:21:50.727185 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-fcqz8_0836451e-5b5f-47bd-8722-283ab5d34a5c/registry-server/0.log" Mar 19 11:21:51 crc kubenswrapper[4765]: I0319 11:21:51.050389 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-v78t5_412ddd32-a861-4cec-8d5e-bb21069835e9/manager/0.log" Mar 19 11:21:51 crc kubenswrapper[4765]: I0319 11:21:51.264767 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-2zbqz_4be8b0ab-5eda-4bf9-8fc4-20dcbe9c406d/manager/0.log" Mar 19 11:21:51 crc kubenswrapper[4765]: I0319 11:21:51.321088 4765 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-hsxfs_408f748b-ca2b-4ae8-8994-63d7da422df9/operator/0.log" Mar 19 11:21:51 crc kubenswrapper[4765]: I0319 11:21:51.541827 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-hfm25_c3466125-06fe-4c5d-872d-a778806a0e23/manager/0.log" Mar 19 11:21:51 crc kubenswrapper[4765]: I0319 11:21:51.917206 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-xdlrz_e843ba99-8859-41d7-9142-ed9227b4d8e1/manager/0.log" Mar 19 11:21:52 crc kubenswrapper[4765]: I0319 11:21:52.037269 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-7wsnh_ae2caf34-b7b2-486c-9e9e-a48cf04eed87/manager/0.log" Mar 19 11:21:52 crc kubenswrapper[4765]: I0319 11:21:52.041899 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-c9c9c96bc-4hzdg_970bc693-0463-4dfe-8870-fac695fffcae/manager/0.log" Mar 19 11:21:52 crc kubenswrapper[4765]: I0319 11:21:52.202235 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-dbv47_3c3ef321-6a40-4d2e-a414-ad6a65cd32cf/manager/0.log" Mar 19 11:22:00 crc kubenswrapper[4765]: I0319 11:22:00.141973 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565322-d6wnz"] Mar 19 11:22:00 crc kubenswrapper[4765]: E0319 11:22:00.142971 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ec384ee-632e-4be2-a5ad-be645023104d" containerName="container-00" Mar 19 11:22:00 crc kubenswrapper[4765]: I0319 11:22:00.142988 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ec384ee-632e-4be2-a5ad-be645023104d" containerName="container-00" Mar 19 11:22:00 crc 
kubenswrapper[4765]: I0319 11:22:00.143236 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ec384ee-632e-4be2-a5ad-be645023104d" containerName="container-00" Mar 19 11:22:00 crc kubenswrapper[4765]: I0319 11:22:00.143976 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565322-d6wnz" Mar 19 11:22:00 crc kubenswrapper[4765]: I0319 11:22:00.146245 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:22:00 crc kubenswrapper[4765]: I0319 11:22:00.146563 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:22:00 crc kubenswrapper[4765]: I0319 11:22:00.146751 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:22:00 crc kubenswrapper[4765]: I0319 11:22:00.151882 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565322-d6wnz"] Mar 19 11:22:00 crc kubenswrapper[4765]: I0319 11:22:00.283493 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dw79k\" (UniqueName: \"kubernetes.io/projected/af1449ee-08fd-46db-a3eb-e136c881b7e0-kube-api-access-dw79k\") pod \"auto-csr-approver-29565322-d6wnz\" (UID: \"af1449ee-08fd-46db-a3eb-e136c881b7e0\") " pod="openshift-infra/auto-csr-approver-29565322-d6wnz" Mar 19 11:22:00 crc kubenswrapper[4765]: I0319 11:22:00.385489 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dw79k\" (UniqueName: \"kubernetes.io/projected/af1449ee-08fd-46db-a3eb-e136c881b7e0-kube-api-access-dw79k\") pod \"auto-csr-approver-29565322-d6wnz\" (UID: \"af1449ee-08fd-46db-a3eb-e136c881b7e0\") " pod="openshift-infra/auto-csr-approver-29565322-d6wnz" Mar 19 11:22:00 crc kubenswrapper[4765]: I0319 11:22:00.412841 4765 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dw79k\" (UniqueName: \"kubernetes.io/projected/af1449ee-08fd-46db-a3eb-e136c881b7e0-kube-api-access-dw79k\") pod \"auto-csr-approver-29565322-d6wnz\" (UID: \"af1449ee-08fd-46db-a3eb-e136c881b7e0\") " pod="openshift-infra/auto-csr-approver-29565322-d6wnz" Mar 19 11:22:00 crc kubenswrapper[4765]: I0319 11:22:00.467305 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565322-d6wnz" Mar 19 11:22:00 crc kubenswrapper[4765]: I0319 11:22:00.910669 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565322-d6wnz"] Mar 19 11:22:01 crc kubenswrapper[4765]: I0319 11:22:01.656066 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:22:01 crc kubenswrapper[4765]: I0319 11:22:01.656124 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:22:01 crc kubenswrapper[4765]: I0319 11:22:01.717282 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565322-d6wnz" event={"ID":"af1449ee-08fd-46db-a3eb-e136c881b7e0","Type":"ContainerStarted","Data":"ed5ad774714acd34b23389ff10872b853c7e2ddb4e161e674922d97f4aa8e401"} Mar 19 11:22:02 crc kubenswrapper[4765]: I0319 11:22:02.726484 4765 generic.go:334] "Generic (PLEG): container finished" podID="af1449ee-08fd-46db-a3eb-e136c881b7e0" 
containerID="f8ee3491268a44a22190e7d95703d22dea1e44dc97d73190c35fd16db729d67e" exitCode=0 Mar 19 11:22:02 crc kubenswrapper[4765]: I0319 11:22:02.726577 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565322-d6wnz" event={"ID":"af1449ee-08fd-46db-a3eb-e136c881b7e0","Type":"ContainerDied","Data":"f8ee3491268a44a22190e7d95703d22dea1e44dc97d73190c35fd16db729d67e"} Mar 19 11:22:04 crc kubenswrapper[4765]: I0319 11:22:04.110217 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565322-d6wnz" Mar 19 11:22:04 crc kubenswrapper[4765]: I0319 11:22:04.263280 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dw79k\" (UniqueName: \"kubernetes.io/projected/af1449ee-08fd-46db-a3eb-e136c881b7e0-kube-api-access-dw79k\") pod \"af1449ee-08fd-46db-a3eb-e136c881b7e0\" (UID: \"af1449ee-08fd-46db-a3eb-e136c881b7e0\") " Mar 19 11:22:04 crc kubenswrapper[4765]: I0319 11:22:04.268838 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/af1449ee-08fd-46db-a3eb-e136c881b7e0-kube-api-access-dw79k" (OuterVolumeSpecName: "kube-api-access-dw79k") pod "af1449ee-08fd-46db-a3eb-e136c881b7e0" (UID: "af1449ee-08fd-46db-a3eb-e136c881b7e0"). InnerVolumeSpecName "kube-api-access-dw79k". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:22:04 crc kubenswrapper[4765]: I0319 11:22:04.365229 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dw79k\" (UniqueName: \"kubernetes.io/projected/af1449ee-08fd-46db-a3eb-e136c881b7e0-kube-api-access-dw79k\") on node \"crc\" DevicePath \"\"" Mar 19 11:22:04 crc kubenswrapper[4765]: I0319 11:22:04.743868 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565322-d6wnz" event={"ID":"af1449ee-08fd-46db-a3eb-e136c881b7e0","Type":"ContainerDied","Data":"ed5ad774714acd34b23389ff10872b853c7e2ddb4e161e674922d97f4aa8e401"} Mar 19 11:22:04 crc kubenswrapper[4765]: I0319 11:22:04.743919 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ed5ad774714acd34b23389ff10872b853c7e2ddb4e161e674922d97f4aa8e401" Mar 19 11:22:04 crc kubenswrapper[4765]: I0319 11:22:04.743952 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565322-d6wnz" Mar 19 11:22:05 crc kubenswrapper[4765]: I0319 11:22:05.180842 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565316-nw4lf"] Mar 19 11:22:05 crc kubenswrapper[4765]: I0319 11:22:05.191289 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565316-nw4lf"] Mar 19 11:22:06 crc kubenswrapper[4765]: I0319 11:22:06.390096 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96556ca1-43dd-4913-b94c-48ef7d7d00b0" path="/var/lib/kubelet/pods/96556ca1-43dd-4913-b94c-48ef7d7d00b0/volumes" Mar 19 11:22:10 crc kubenswrapper[4765]: I0319 11:22:10.554766 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jdqns_dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0/control-plane-machine-set-operator/0.log" Mar 19 11:22:10 crc kubenswrapper[4765]: 
I0319 11:22:10.680919 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7hqvg_993cdcd1-8323-49aa-b587-5a8c344a2077/kube-rbac-proxy/0.log" Mar 19 11:22:10 crc kubenswrapper[4765]: I0319 11:22:10.752762 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7hqvg_993cdcd1-8323-49aa-b587-5a8c344a2077/machine-api-operator/0.log" Mar 19 11:22:16 crc kubenswrapper[4765]: I0319 11:22:16.041382 4765 scope.go:117] "RemoveContainer" containerID="310dd71db96197a0ac359aa87454e087fd00caa995716204cd037a442e9fa1a5" Mar 19 11:22:22 crc kubenswrapper[4765]: I0319 11:22:22.841866 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-nb6dw_4a6670fe-5988-4bfd-8468-b2a5f6cd9997/cert-manager-controller/0.log" Mar 19 11:22:23 crc kubenswrapper[4765]: I0319 11:22:23.009376 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-tp66g_67148642-28c7-4217-b91a-3badb42c4c38/cert-manager-cainjector/0.log" Mar 19 11:22:23 crc kubenswrapper[4765]: I0319 11:22:23.080777 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-w862m_1b499d05-d228-4268-8b1f-8b3c8687870f/cert-manager-webhook/0.log" Mar 19 11:22:31 crc kubenswrapper[4765]: I0319 11:22:31.656379 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:22:31 crc kubenswrapper[4765]: I0319 11:22:31.657014 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" 
probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:22:31 crc kubenswrapper[4765]: I0319 11:22:31.657081 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 11:22:31 crc kubenswrapper[4765]: I0319 11:22:31.657912 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9f428a0136b444a0c21f9c6c085da235b98f398bb53eda25b0fa7e3ce28d5318"} pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 11:22:31 crc kubenswrapper[4765]: I0319 11:22:31.657977 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://9f428a0136b444a0c21f9c6c085da235b98f398bb53eda25b0fa7e3ce28d5318" gracePeriod=600 Mar 19 11:22:31 crc kubenswrapper[4765]: I0319 11:22:31.982194 4765 generic.go:334] "Generic (PLEG): container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerID="9f428a0136b444a0c21f9c6c085da235b98f398bb53eda25b0fa7e3ce28d5318" exitCode=0 Mar 19 11:22:31 crc kubenswrapper[4765]: I0319 11:22:31.982519 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"9f428a0136b444a0c21f9c6c085da235b98f398bb53eda25b0fa7e3ce28d5318"} Mar 19 11:22:31 crc kubenswrapper[4765]: I0319 11:22:31.982579 4765 scope.go:117] "RemoveContainer" containerID="6d1ba1ee2f0fbd22414a07a7f930ff26251a55c2e5c85d33bafbf0ce9a2fd78c" Mar 19 11:22:32 crc kubenswrapper[4765]: I0319 11:22:32.992255 4765 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e"} Mar 19 11:22:34 crc kubenswrapper[4765]: I0319 11:22:34.828524 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-x2vbf_76583484-f8aa-4a95-8450-206a93fb2b6c/nmstate-console-plugin/0.log" Mar 19 11:22:35 crc kubenswrapper[4765]: I0319 11:22:35.057317 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-n2hqs_e3d573a8-f56f-45e5-9905-5810e82af6ac/nmstate-handler/0.log" Mar 19 11:22:35 crc kubenswrapper[4765]: I0319 11:22:35.136042 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-z6jbc_5a5acf7a-e38f-4ef1-9576-eae0d8e9a582/kube-rbac-proxy/0.log" Mar 19 11:22:35 crc kubenswrapper[4765]: I0319 11:22:35.213013 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-z6jbc_5a5acf7a-e38f-4ef1-9576-eae0d8e9a582/nmstate-metrics/0.log" Mar 19 11:22:35 crc kubenswrapper[4765]: I0319 11:22:35.293611 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-9bsj5_75557987-e600-4f26-b66a-45a76da143cf/nmstate-operator/0.log" Mar 19 11:22:35 crc kubenswrapper[4765]: I0319 11:22:35.408904 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-rl6q7_ac4e199b-7261-41f0-b9e9-51b167be05a7/nmstate-webhook/0.log" Mar 19 11:23:02 crc kubenswrapper[4765]: I0319 11:23:02.155075 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-2rmkf_0f10db70-5575-427a-b0de-f36a4c0a5feb/kube-rbac-proxy/0.log" Mar 19 11:23:02 crc kubenswrapper[4765]: I0319 11:23:02.308865 4765 
log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-2rmkf_0f10db70-5575-427a-b0de-f36a4c0a5feb/controller/0.log" Mar 19 11:23:02 crc kubenswrapper[4765]: I0319 11:23:02.377642 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-frr-files/0.log" Mar 19 11:23:02 crc kubenswrapper[4765]: I0319 11:23:02.572873 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-reloader/0.log" Mar 19 11:23:02 crc kubenswrapper[4765]: I0319 11:23:02.599574 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-metrics/0.log" Mar 19 11:23:02 crc kubenswrapper[4765]: I0319 11:23:02.622675 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-frr-files/0.log" Mar 19 11:23:02 crc kubenswrapper[4765]: I0319 11:23:02.653807 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-reloader/0.log" Mar 19 11:23:02 crc kubenswrapper[4765]: I0319 11:23:02.822558 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-metrics/0.log" Mar 19 11:23:02 crc kubenswrapper[4765]: I0319 11:23:02.859040 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-frr-files/0.log" Mar 19 11:23:02 crc kubenswrapper[4765]: I0319 11:23:02.859257 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-reloader/0.log" Mar 19 11:23:02 crc kubenswrapper[4765]: I0319 11:23:02.902000 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-metrics/0.log" Mar 19 11:23:03 crc kubenswrapper[4765]: I0319 11:23:03.097419 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-frr-files/0.log" Mar 19 11:23:03 crc kubenswrapper[4765]: I0319 11:23:03.099917 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-reloader/0.log" Mar 19 11:23:03 crc kubenswrapper[4765]: I0319 11:23:03.101050 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/controller/0.log" Mar 19 11:23:03 crc kubenswrapper[4765]: I0319 11:23:03.113183 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-metrics/0.log" Mar 19 11:23:03 crc kubenswrapper[4765]: I0319 11:23:03.338565 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/frr-metrics/0.log" Mar 19 11:23:03 crc kubenswrapper[4765]: I0319 11:23:03.364201 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/kube-rbac-proxy/0.log" Mar 19 11:23:03 crc kubenswrapper[4765]: I0319 11:23:03.397893 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/kube-rbac-proxy-frr/0.log" Mar 19 11:23:03 crc kubenswrapper[4765]: I0319 11:23:03.661722 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/reloader/0.log" Mar 19 11:23:03 crc kubenswrapper[4765]: I0319 11:23:03.687949 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-f6fhw_bff02354-3273-4396-b996-06a749a9692f/frr-k8s-webhook-server/0.log" Mar 19 11:23:04 crc kubenswrapper[4765]: I0319 11:23:04.152401 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b6fd59d6c-c8kfg_fc004464-2eb9-4b7d-addf-91b7b69e01b6/manager/0.log" Mar 19 11:23:04 crc kubenswrapper[4765]: I0319 11:23:04.210165 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-757bfbd67b-zzdvj_d3f409a1-a978-47fc-9907-fec4a720ae18/webhook-server/0.log" Mar 19 11:23:04 crc kubenswrapper[4765]: I0319 11:23:04.477983 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-czcsr_afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71/kube-rbac-proxy/0.log" Mar 19 11:23:04 crc kubenswrapper[4765]: I0319 11:23:04.880575 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/frr/0.log" Mar 19 11:23:04 crc kubenswrapper[4765]: I0319 11:23:04.954384 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-czcsr_afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71/speaker/0.log" Mar 19 11:23:17 crc kubenswrapper[4765]: I0319 11:23:17.782265 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z_2bd39f53-5aca-44bd-93ed-bff9ffafb381/util/0.log" Mar 19 11:23:18 crc kubenswrapper[4765]: I0319 11:23:18.010758 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z_2bd39f53-5aca-44bd-93ed-bff9ffafb381/util/0.log" Mar 19 11:23:18 crc kubenswrapper[4765]: I0319 11:23:18.096632 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z_2bd39f53-5aca-44bd-93ed-bff9ffafb381/pull/0.log" Mar 19 11:23:18 crc kubenswrapper[4765]: I0319 11:23:18.100671 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z_2bd39f53-5aca-44bd-93ed-bff9ffafb381/pull/0.log" Mar 19 11:23:18 crc kubenswrapper[4765]: I0319 11:23:18.267518 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z_2bd39f53-5aca-44bd-93ed-bff9ffafb381/util/0.log" Mar 19 11:23:18 crc kubenswrapper[4765]: I0319 11:23:18.304559 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z_2bd39f53-5aca-44bd-93ed-bff9ffafb381/extract/0.log" Mar 19 11:23:18 crc kubenswrapper[4765]: I0319 11:23:18.352841 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z_2bd39f53-5aca-44bd-93ed-bff9ffafb381/pull/0.log" Mar 19 11:23:18 crc kubenswrapper[4765]: I0319 11:23:18.470599 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2_379c6607-c195-4779-83b7-bdc20f7cda09/util/0.log" Mar 19 11:23:18 crc kubenswrapper[4765]: I0319 11:23:18.668466 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2_379c6607-c195-4779-83b7-bdc20f7cda09/pull/0.log" Mar 19 11:23:18 crc kubenswrapper[4765]: I0319 11:23:18.703284 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2_379c6607-c195-4779-83b7-bdc20f7cda09/pull/0.log" Mar 19 
11:23:18 crc kubenswrapper[4765]: I0319 11:23:18.869054 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2_379c6607-c195-4779-83b7-bdc20f7cda09/util/0.log" Mar 19 11:23:19 crc kubenswrapper[4765]: I0319 11:23:19.011796 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2_379c6607-c195-4779-83b7-bdc20f7cda09/util/0.log" Mar 19 11:23:19 crc kubenswrapper[4765]: I0319 11:23:19.030951 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2_379c6607-c195-4779-83b7-bdc20f7cda09/extract/0.log" Mar 19 11:23:19 crc kubenswrapper[4765]: I0319 11:23:19.048723 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2_379c6607-c195-4779-83b7-bdc20f7cda09/pull/0.log" Mar 19 11:23:19 crc kubenswrapper[4765]: I0319 11:23:19.194951 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cd4rs_00429786-088e-4d39-be6e-050615aeba42/extract-utilities/0.log" Mar 19 11:23:19 crc kubenswrapper[4765]: I0319 11:23:19.362073 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cd4rs_00429786-088e-4d39-be6e-050615aeba42/extract-content/0.log" Mar 19 11:23:19 crc kubenswrapper[4765]: I0319 11:23:19.393843 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cd4rs_00429786-088e-4d39-be6e-050615aeba42/extract-content/0.log" Mar 19 11:23:19 crc kubenswrapper[4765]: I0319 11:23:19.410251 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cd4rs_00429786-088e-4d39-be6e-050615aeba42/extract-utilities/0.log" Mar 19 
11:23:19 crc kubenswrapper[4765]: I0319 11:23:19.544807 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cd4rs_00429786-088e-4d39-be6e-050615aeba42/extract-utilities/0.log" Mar 19 11:23:19 crc kubenswrapper[4765]: I0319 11:23:19.631694 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cd4rs_00429786-088e-4d39-be6e-050615aeba42/extract-content/0.log" Mar 19 11:23:19 crc kubenswrapper[4765]: I0319 11:23:19.842311 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6rrpf_1c5d371b-d02f-497c-af99-4c138232f8c0/extract-utilities/0.log" Mar 19 11:23:19 crc kubenswrapper[4765]: I0319 11:23:19.996948 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6rrpf_1c5d371b-d02f-497c-af99-4c138232f8c0/extract-content/0.log" Mar 19 11:23:20 crc kubenswrapper[4765]: I0319 11:23:20.034156 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6rrpf_1c5d371b-d02f-497c-af99-4c138232f8c0/extract-utilities/0.log" Mar 19 11:23:20 crc kubenswrapper[4765]: I0319 11:23:20.102505 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6rrpf_1c5d371b-d02f-497c-af99-4c138232f8c0/extract-content/0.log" Mar 19 11:23:20 crc kubenswrapper[4765]: I0319 11:23:20.284600 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cd4rs_00429786-088e-4d39-be6e-050615aeba42/registry-server/0.log" Mar 19 11:23:20 crc kubenswrapper[4765]: I0319 11:23:20.376425 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6rrpf_1c5d371b-d02f-497c-af99-4c138232f8c0/extract-content/0.log" Mar 19 11:23:20 crc kubenswrapper[4765]: I0319 11:23:20.377228 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-6rrpf_1c5d371b-d02f-497c-af99-4c138232f8c0/extract-utilities/0.log" Mar 19 11:23:20 crc kubenswrapper[4765]: I0319 11:23:20.577011 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-6xvzl_5e3d8e97-79f8-43d2-acf6-f20ef33cadd3/marketplace-operator/0.log" Mar 19 11:23:20 crc kubenswrapper[4765]: I0319 11:23:20.587324 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6rrpf_1c5d371b-d02f-497c-af99-4c138232f8c0/registry-server/0.log" Mar 19 11:23:20 crc kubenswrapper[4765]: I0319 11:23:20.755196 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9chfw_df325686-add0-407b-afdf-f9093391d64c/extract-utilities/0.log" Mar 19 11:23:20 crc kubenswrapper[4765]: I0319 11:23:20.942390 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9chfw_df325686-add0-407b-afdf-f9093391d64c/extract-utilities/0.log" Mar 19 11:23:20 crc kubenswrapper[4765]: I0319 11:23:20.986825 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9chfw_df325686-add0-407b-afdf-f9093391d64c/extract-content/0.log" Mar 19 11:23:20 crc kubenswrapper[4765]: I0319 11:23:20.994064 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9chfw_df325686-add0-407b-afdf-f9093391d64c/extract-content/0.log" Mar 19 11:23:21 crc kubenswrapper[4765]: I0319 11:23:21.149911 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9chfw_df325686-add0-407b-afdf-f9093391d64c/extract-content/0.log" Mar 19 11:23:21 crc kubenswrapper[4765]: I0319 11:23:21.168625 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-9chfw_df325686-add0-407b-afdf-f9093391d64c/extract-utilities/0.log" Mar 19 11:23:21 crc kubenswrapper[4765]: I0319 11:23:21.348258 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9chfw_df325686-add0-407b-afdf-f9093391d64c/registry-server/0.log" Mar 19 11:23:21 crc kubenswrapper[4765]: I0319 11:23:21.399283 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prwdk_b99b05c7-b9bf-4814-9d29-b5d9076a98a8/extract-utilities/0.log" Mar 19 11:23:21 crc kubenswrapper[4765]: I0319 11:23:21.601784 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prwdk_b99b05c7-b9bf-4814-9d29-b5d9076a98a8/extract-content/0.log" Mar 19 11:23:21 crc kubenswrapper[4765]: I0319 11:23:21.620080 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prwdk_b99b05c7-b9bf-4814-9d29-b5d9076a98a8/extract-utilities/0.log" Mar 19 11:23:21 crc kubenswrapper[4765]: I0319 11:23:21.646399 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prwdk_b99b05c7-b9bf-4814-9d29-b5d9076a98a8/extract-content/0.log" Mar 19 11:23:21 crc kubenswrapper[4765]: I0319 11:23:21.792050 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prwdk_b99b05c7-b9bf-4814-9d29-b5d9076a98a8/extract-utilities/0.log" Mar 19 11:23:21 crc kubenswrapper[4765]: I0319 11:23:21.798159 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prwdk_b99b05c7-b9bf-4814-9d29-b5d9076a98a8/extract-content/0.log" Mar 19 11:23:22 crc kubenswrapper[4765]: I0319 11:23:22.127764 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prwdk_b99b05c7-b9bf-4814-9d29-b5d9076a98a8/registry-server/0.log" Mar 19 
11:23:58 crc kubenswrapper[4765]: E0319 11:23:58.268478 4765 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.13:57830->38.129.56.13:35029: write tcp 38.129.56.13:57830->38.129.56.13:35029: write: connection reset by peer Mar 19 11:24:00 crc kubenswrapper[4765]: I0319 11:24:00.157525 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565324-52nm7"] Mar 19 11:24:00 crc kubenswrapper[4765]: E0319 11:24:00.158403 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="af1449ee-08fd-46db-a3eb-e136c881b7e0" containerName="oc" Mar 19 11:24:00 crc kubenswrapper[4765]: I0319 11:24:00.158418 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="af1449ee-08fd-46db-a3eb-e136c881b7e0" containerName="oc" Mar 19 11:24:00 crc kubenswrapper[4765]: I0319 11:24:00.158655 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="af1449ee-08fd-46db-a3eb-e136c881b7e0" containerName="oc" Mar 19 11:24:00 crc kubenswrapper[4765]: I0319 11:24:00.159475 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565324-52nm7" Mar 19 11:24:00 crc kubenswrapper[4765]: I0319 11:24:00.162689 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:24:00 crc kubenswrapper[4765]: I0319 11:24:00.162695 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:24:00 crc kubenswrapper[4765]: I0319 11:24:00.163614 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:24:00 crc kubenswrapper[4765]: I0319 11:24:00.169505 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565324-52nm7"] Mar 19 11:24:00 crc kubenswrapper[4765]: I0319 11:24:00.252198 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ztrvd\" (UniqueName: \"kubernetes.io/projected/ad6ae4f9-0eef-4a46-b395-bc07375a436c-kube-api-access-ztrvd\") pod \"auto-csr-approver-29565324-52nm7\" (UID: \"ad6ae4f9-0eef-4a46-b395-bc07375a436c\") " pod="openshift-infra/auto-csr-approver-29565324-52nm7" Mar 19 11:24:00 crc kubenswrapper[4765]: I0319 11:24:00.353929 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ztrvd\" (UniqueName: \"kubernetes.io/projected/ad6ae4f9-0eef-4a46-b395-bc07375a436c-kube-api-access-ztrvd\") pod \"auto-csr-approver-29565324-52nm7\" (UID: \"ad6ae4f9-0eef-4a46-b395-bc07375a436c\") " pod="openshift-infra/auto-csr-approver-29565324-52nm7" Mar 19 11:24:00 crc kubenswrapper[4765]: I0319 11:24:00.374567 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ztrvd\" (UniqueName: \"kubernetes.io/projected/ad6ae4f9-0eef-4a46-b395-bc07375a436c-kube-api-access-ztrvd\") pod \"auto-csr-approver-29565324-52nm7\" (UID: \"ad6ae4f9-0eef-4a46-b395-bc07375a436c\") " 
pod="openshift-infra/auto-csr-approver-29565324-52nm7" Mar 19 11:24:00 crc kubenswrapper[4765]: I0319 11:24:00.480764 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565324-52nm7" Mar 19 11:24:00 crc kubenswrapper[4765]: I0319 11:24:00.921487 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565324-52nm7"] Mar 19 11:24:01 crc kubenswrapper[4765]: I0319 11:24:01.835912 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565324-52nm7" event={"ID":"ad6ae4f9-0eef-4a46-b395-bc07375a436c","Type":"ContainerStarted","Data":"5800bcba50bd19d22cf5b0097cb61ea3eda9fe3a1515c6b1c853f325c88d3ca6"} Mar 19 11:24:02 crc kubenswrapper[4765]: I0319 11:24:02.850154 4765 generic.go:334] "Generic (PLEG): container finished" podID="ad6ae4f9-0eef-4a46-b395-bc07375a436c" containerID="ad104457548f180ec2ec3f0bb98a753afc784404a036958379594b94817b4bd4" exitCode=0 Mar 19 11:24:02 crc kubenswrapper[4765]: I0319 11:24:02.850447 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565324-52nm7" event={"ID":"ad6ae4f9-0eef-4a46-b395-bc07375a436c","Type":"ContainerDied","Data":"ad104457548f180ec2ec3f0bb98a753afc784404a036958379594b94817b4bd4"} Mar 19 11:24:04 crc kubenswrapper[4765]: I0319 11:24:04.289404 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565324-52nm7" Mar 19 11:24:04 crc kubenswrapper[4765]: I0319 11:24:04.343698 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ztrvd\" (UniqueName: \"kubernetes.io/projected/ad6ae4f9-0eef-4a46-b395-bc07375a436c-kube-api-access-ztrvd\") pod \"ad6ae4f9-0eef-4a46-b395-bc07375a436c\" (UID: \"ad6ae4f9-0eef-4a46-b395-bc07375a436c\") " Mar 19 11:24:04 crc kubenswrapper[4765]: I0319 11:24:04.351890 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad6ae4f9-0eef-4a46-b395-bc07375a436c-kube-api-access-ztrvd" (OuterVolumeSpecName: "kube-api-access-ztrvd") pod "ad6ae4f9-0eef-4a46-b395-bc07375a436c" (UID: "ad6ae4f9-0eef-4a46-b395-bc07375a436c"). InnerVolumeSpecName "kube-api-access-ztrvd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:24:04 crc kubenswrapper[4765]: I0319 11:24:04.445812 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ztrvd\" (UniqueName: \"kubernetes.io/projected/ad6ae4f9-0eef-4a46-b395-bc07375a436c-kube-api-access-ztrvd\") on node \"crc\" DevicePath \"\"" Mar 19 11:24:04 crc kubenswrapper[4765]: I0319 11:24:04.870532 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565324-52nm7" event={"ID":"ad6ae4f9-0eef-4a46-b395-bc07375a436c","Type":"ContainerDied","Data":"5800bcba50bd19d22cf5b0097cb61ea3eda9fe3a1515c6b1c853f325c88d3ca6"} Mar 19 11:24:04 crc kubenswrapper[4765]: I0319 11:24:04.870572 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5800bcba50bd19d22cf5b0097cb61ea3eda9fe3a1515c6b1c853f325c88d3ca6" Mar 19 11:24:04 crc kubenswrapper[4765]: I0319 11:24:04.870926 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565324-52nm7" Mar 19 11:24:05 crc kubenswrapper[4765]: I0319 11:24:05.368793 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565318-68f87"] Mar 19 11:24:05 crc kubenswrapper[4765]: I0319 11:24:05.378666 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565318-68f87"] Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.133676 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-9gp76"] Mar 19 11:24:06 crc kubenswrapper[4765]: E0319 11:24:06.134462 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ad6ae4f9-0eef-4a46-b395-bc07375a436c" containerName="oc" Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.134487 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad6ae4f9-0eef-4a46-b395-bc07375a436c" containerName="oc" Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.134729 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="ad6ae4f9-0eef-4a46-b395-bc07375a436c" containerName="oc" Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.136745 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.150070 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gp76"] Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.192297 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l44m\" (UniqueName: \"kubernetes.io/projected/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-kube-api-access-4l44m\") pod \"redhat-marketplace-9gp76\" (UID: \"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c\") " pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.192422 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-utilities\") pod \"redhat-marketplace-9gp76\" (UID: \"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c\") " pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.192491 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-catalog-content\") pod \"redhat-marketplace-9gp76\" (UID: \"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c\") " pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.294346 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l44m\" (UniqueName: \"kubernetes.io/projected/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-kube-api-access-4l44m\") pod \"redhat-marketplace-9gp76\" (UID: \"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c\") " pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.294480 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-utilities\") pod \"redhat-marketplace-9gp76\" (UID: \"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c\") " pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.294540 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-catalog-content\") pod \"redhat-marketplace-9gp76\" (UID: \"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c\") " pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.295213 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-catalog-content\") pod \"redhat-marketplace-9gp76\" (UID: \"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c\") " pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.295297 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-utilities\") pod \"redhat-marketplace-9gp76\" (UID: \"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c\") " pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.315858 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l44m\" (UniqueName: \"kubernetes.io/projected/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-kube-api-access-4l44m\") pod \"redhat-marketplace-9gp76\" (UID: \"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c\") " pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.370322 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="5f2a323c-cc66-49f2-b43d-6fc8bb970dd3" path="/var/lib/kubelet/pods/5f2a323c-cc66-49f2-b43d-6fc8bb970dd3/volumes" Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.458916 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:06 crc kubenswrapper[4765]: I0319 11:24:06.986073 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gp76"] Mar 19 11:24:07 crc kubenswrapper[4765]: I0319 11:24:07.898766 4765 generic.go:334] "Generic (PLEG): container finished" podID="d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c" containerID="f9cc049cb46f8f521b278cd23749a7a90830b91c6568ad52daa4f5c4dcf1d67b" exitCode=0 Mar 19 11:24:07 crc kubenswrapper[4765]: I0319 11:24:07.899192 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gp76" event={"ID":"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c","Type":"ContainerDied","Data":"f9cc049cb46f8f521b278cd23749a7a90830b91c6568ad52daa4f5c4dcf1d67b"} Mar 19 11:24:07 crc kubenswrapper[4765]: I0319 11:24:07.899226 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gp76" event={"ID":"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c","Type":"ContainerStarted","Data":"2e735d0238a0bdf1d7daacef3f9582b7cce91cbbcabd460dbc09a01d81d5ea2d"} Mar 19 11:24:09 crc kubenswrapper[4765]: I0319 11:24:09.920003 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gp76" event={"ID":"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c","Type":"ContainerStarted","Data":"4232d82032442a67c7470e4dcebfc67ec60537ce237818095dea3c0d93279e2b"} Mar 19 11:24:11 crc kubenswrapper[4765]: I0319 11:24:11.942801 4765 generic.go:334] "Generic (PLEG): container finished" podID="d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c" containerID="4232d82032442a67c7470e4dcebfc67ec60537ce237818095dea3c0d93279e2b" exitCode=0 Mar 19 11:24:11 crc 
kubenswrapper[4765]: I0319 11:24:11.942864 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gp76" event={"ID":"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c","Type":"ContainerDied","Data":"4232d82032442a67c7470e4dcebfc67ec60537ce237818095dea3c0d93279e2b"} Mar 19 11:24:13 crc kubenswrapper[4765]: I0319 11:24:13.962497 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gp76" event={"ID":"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c","Type":"ContainerStarted","Data":"712db545786b25d09bcbb66f0e18bd90e42cd67abc609ea694be7e9d90da8bda"} Mar 19 11:24:13 crc kubenswrapper[4765]: I0319 11:24:13.986921 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-9gp76" podStartSLOduration=2.772410919 podStartE2EDuration="7.986903427s" podCreationTimestamp="2026-03-19 11:24:06 +0000 UTC" firstStartedPulling="2026-03-19 11:24:07.901255779 +0000 UTC m=+3746.250201321" lastFinishedPulling="2026-03-19 11:24:13.115748287 +0000 UTC m=+3751.464693829" observedRunningTime="2026-03-19 11:24:13.980427563 +0000 UTC m=+3752.329373115" watchObservedRunningTime="2026-03-19 11:24:13.986903427 +0000 UTC m=+3752.335848969" Mar 19 11:24:16 crc kubenswrapper[4765]: I0319 11:24:16.154698 4765 scope.go:117] "RemoveContainer" containerID="6fc5a138083539101dd6ef835737b7db82f036582c816fe1daae206a9f55f116" Mar 19 11:24:16 crc kubenswrapper[4765]: I0319 11:24:16.460238 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:16 crc kubenswrapper[4765]: I0319 11:24:16.460289 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:16 crc kubenswrapper[4765]: I0319 11:24:16.516345 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:26 crc kubenswrapper[4765]: I0319 11:24:26.514504 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:26 crc kubenswrapper[4765]: I0319 11:24:26.577514 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gp76"] Mar 19 11:24:27 crc kubenswrapper[4765]: I0319 11:24:27.093878 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-9gp76" podUID="d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c" containerName="registry-server" containerID="cri-o://712db545786b25d09bcbb66f0e18bd90e42cd67abc609ea694be7e9d90da8bda" gracePeriod=2 Mar 19 11:24:27 crc kubenswrapper[4765]: I0319 11:24:27.583366 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:27 crc kubenswrapper[4765]: I0319 11:24:27.747455 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l44m\" (UniqueName: \"kubernetes.io/projected/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-kube-api-access-4l44m\") pod \"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c\" (UID: \"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c\") " Mar 19 11:24:27 crc kubenswrapper[4765]: I0319 11:24:27.747684 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-catalog-content\") pod \"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c\" (UID: \"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c\") " Mar 19 11:24:27 crc kubenswrapper[4765]: I0319 11:24:27.747763 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-utilities\") pod 
\"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c\" (UID: \"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c\") " Mar 19 11:24:27 crc kubenswrapper[4765]: I0319 11:24:27.748606 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-utilities" (OuterVolumeSpecName: "utilities") pod "d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c" (UID: "d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:24:27 crc kubenswrapper[4765]: I0319 11:24:27.753697 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-kube-api-access-4l44m" (OuterVolumeSpecName: "kube-api-access-4l44m") pod "d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c" (UID: "d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c"). InnerVolumeSpecName "kube-api-access-4l44m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:24:27 crc kubenswrapper[4765]: I0319 11:24:27.776535 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c" (UID: "d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:24:27 crc kubenswrapper[4765]: I0319 11:24:27.850656 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l44m\" (UniqueName: \"kubernetes.io/projected/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-kube-api-access-4l44m\") on node \"crc\" DevicePath \"\"" Mar 19 11:24:27 crc kubenswrapper[4765]: I0319 11:24:27.850697 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 11:24:27 crc kubenswrapper[4765]: I0319 11:24:27.850710 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.104404 4765 generic.go:334] "Generic (PLEG): container finished" podID="d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c" containerID="712db545786b25d09bcbb66f0e18bd90e42cd67abc609ea694be7e9d90da8bda" exitCode=0 Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.104450 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gp76" event={"ID":"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c","Type":"ContainerDied","Data":"712db545786b25d09bcbb66f0e18bd90e42cd67abc609ea694be7e9d90da8bda"} Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.104477 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-9gp76" event={"ID":"d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c","Type":"ContainerDied","Data":"2e735d0238a0bdf1d7daacef3f9582b7cce91cbbcabd460dbc09a01d81d5ea2d"} Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.104476 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-9gp76" Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.104495 4765 scope.go:117] "RemoveContainer" containerID="712db545786b25d09bcbb66f0e18bd90e42cd67abc609ea694be7e9d90da8bda" Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.138593 4765 scope.go:117] "RemoveContainer" containerID="4232d82032442a67c7470e4dcebfc67ec60537ce237818095dea3c0d93279e2b" Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.156553 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gp76"] Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.165743 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-9gp76"] Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.167521 4765 scope.go:117] "RemoveContainer" containerID="f9cc049cb46f8f521b278cd23749a7a90830b91c6568ad52daa4f5c4dcf1d67b" Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.227813 4765 scope.go:117] "RemoveContainer" containerID="712db545786b25d09bcbb66f0e18bd90e42cd67abc609ea694be7e9d90da8bda" Mar 19 11:24:28 crc kubenswrapper[4765]: E0319 11:24:28.228504 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"712db545786b25d09bcbb66f0e18bd90e42cd67abc609ea694be7e9d90da8bda\": container with ID starting with 712db545786b25d09bcbb66f0e18bd90e42cd67abc609ea694be7e9d90da8bda not found: ID does not exist" containerID="712db545786b25d09bcbb66f0e18bd90e42cd67abc609ea694be7e9d90da8bda" Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.228569 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"712db545786b25d09bcbb66f0e18bd90e42cd67abc609ea694be7e9d90da8bda"} err="failed to get container status \"712db545786b25d09bcbb66f0e18bd90e42cd67abc609ea694be7e9d90da8bda\": rpc error: code = NotFound desc = could not find container 
\"712db545786b25d09bcbb66f0e18bd90e42cd67abc609ea694be7e9d90da8bda\": container with ID starting with 712db545786b25d09bcbb66f0e18bd90e42cd67abc609ea694be7e9d90da8bda not found: ID does not exist" Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.228604 4765 scope.go:117] "RemoveContainer" containerID="4232d82032442a67c7470e4dcebfc67ec60537ce237818095dea3c0d93279e2b" Mar 19 11:24:28 crc kubenswrapper[4765]: E0319 11:24:28.229420 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4232d82032442a67c7470e4dcebfc67ec60537ce237818095dea3c0d93279e2b\": container with ID starting with 4232d82032442a67c7470e4dcebfc67ec60537ce237818095dea3c0d93279e2b not found: ID does not exist" containerID="4232d82032442a67c7470e4dcebfc67ec60537ce237818095dea3c0d93279e2b" Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.229486 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4232d82032442a67c7470e4dcebfc67ec60537ce237818095dea3c0d93279e2b"} err="failed to get container status \"4232d82032442a67c7470e4dcebfc67ec60537ce237818095dea3c0d93279e2b\": rpc error: code = NotFound desc = could not find container \"4232d82032442a67c7470e4dcebfc67ec60537ce237818095dea3c0d93279e2b\": container with ID starting with 4232d82032442a67c7470e4dcebfc67ec60537ce237818095dea3c0d93279e2b not found: ID does not exist" Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.229994 4765 scope.go:117] "RemoveContainer" containerID="f9cc049cb46f8f521b278cd23749a7a90830b91c6568ad52daa4f5c4dcf1d67b" Mar 19 11:24:28 crc kubenswrapper[4765]: E0319 11:24:28.230527 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f9cc049cb46f8f521b278cd23749a7a90830b91c6568ad52daa4f5c4dcf1d67b\": container with ID starting with f9cc049cb46f8f521b278cd23749a7a90830b91c6568ad52daa4f5c4dcf1d67b not found: ID does not exist" 
containerID="f9cc049cb46f8f521b278cd23749a7a90830b91c6568ad52daa4f5c4dcf1d67b" Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.230558 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f9cc049cb46f8f521b278cd23749a7a90830b91c6568ad52daa4f5c4dcf1d67b"} err="failed to get container status \"f9cc049cb46f8f521b278cd23749a7a90830b91c6568ad52daa4f5c4dcf1d67b\": rpc error: code = NotFound desc = could not find container \"f9cc049cb46f8f521b278cd23749a7a90830b91c6568ad52daa4f5c4dcf1d67b\": container with ID starting with f9cc049cb46f8f521b278cd23749a7a90830b91c6568ad52daa4f5c4dcf1d67b not found: ID does not exist" Mar 19 11:24:28 crc kubenswrapper[4765]: I0319 11:24:28.368665 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c" path="/var/lib/kubelet/pods/d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c/volumes" Mar 19 11:24:31 crc kubenswrapper[4765]: I0319 11:24:31.656438 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:24:31 crc kubenswrapper[4765]: I0319 11:24:31.656784 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:25:01 crc kubenswrapper[4765]: I0319 11:25:01.656023 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection 
refused" start-of-body= Mar 19 11:25:01 crc kubenswrapper[4765]: I0319 11:25:01.656606 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:25:17 crc kubenswrapper[4765]: I0319 11:25:17.607739 4765 generic.go:334] "Generic (PLEG): container finished" podID="db728e4f-3565-48a8-a6ad-7b725e672b93" containerID="5cce4dd943518d2733f1343d0221ca9096b250e1496e862371fa404e565e916a" exitCode=0 Mar 19 11:25:17 crc kubenswrapper[4765]: I0319 11:25:17.607866 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-vk8z8/must-gather-bn4w7" event={"ID":"db728e4f-3565-48a8-a6ad-7b725e672b93","Type":"ContainerDied","Data":"5cce4dd943518d2733f1343d0221ca9096b250e1496e862371fa404e565e916a"} Mar 19 11:25:17 crc kubenswrapper[4765]: I0319 11:25:17.614871 4765 scope.go:117] "RemoveContainer" containerID="5cce4dd943518d2733f1343d0221ca9096b250e1496e862371fa404e565e916a" Mar 19 11:25:17 crc kubenswrapper[4765]: I0319 11:25:17.698083 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vk8z8_must-gather-bn4w7_db728e4f-3565-48a8-a6ad-7b725e672b93/gather/0.log" Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.079571 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-vk8z8/must-gather-bn4w7"] Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.082037 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-vk8z8/must-gather-bn4w7" podUID="db728e4f-3565-48a8-a6ad-7b725e672b93" containerName="copy" containerID="cri-o://e533d82f4b8e23e3dce5b308ccddfc07652c194c518aef279b6c04022ca4bf7b" gracePeriod=2 Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.087722 4765 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-vk8z8/must-gather-bn4w7"] Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.629108 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vk8z8_must-gather-bn4w7_db728e4f-3565-48a8-a6ad-7b725e672b93/copy/0.log" Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.629730 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-vk8z8/must-gather-bn4w7" Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.712881 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-vk8z8_must-gather-bn4w7_db728e4f-3565-48a8-a6ad-7b725e672b93/copy/0.log" Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.713442 4765 generic.go:334] "Generic (PLEG): container finished" podID="db728e4f-3565-48a8-a6ad-7b725e672b93" containerID="e533d82f4b8e23e3dce5b308ccddfc07652c194c518aef279b6c04022ca4bf7b" exitCode=143 Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.713546 4765 scope.go:117] "RemoveContainer" containerID="e533d82f4b8e23e3dce5b308ccddfc07652c194c518aef279b6c04022ca4bf7b" Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.713540 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-vk8z8/must-gather-bn4w7" Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.740874 4765 scope.go:117] "RemoveContainer" containerID="5cce4dd943518d2733f1343d0221ca9096b250e1496e862371fa404e565e916a" Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.743283 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db728e4f-3565-48a8-a6ad-7b725e672b93-must-gather-output\") pod \"db728e4f-3565-48a8-a6ad-7b725e672b93\" (UID: \"db728e4f-3565-48a8-a6ad-7b725e672b93\") " Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.743372 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r8jrv\" (UniqueName: \"kubernetes.io/projected/db728e4f-3565-48a8-a6ad-7b725e672b93-kube-api-access-r8jrv\") pod \"db728e4f-3565-48a8-a6ad-7b725e672b93\" (UID: \"db728e4f-3565-48a8-a6ad-7b725e672b93\") " Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.753953 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/db728e4f-3565-48a8-a6ad-7b725e672b93-kube-api-access-r8jrv" (OuterVolumeSpecName: "kube-api-access-r8jrv") pod "db728e4f-3565-48a8-a6ad-7b725e672b93" (UID: "db728e4f-3565-48a8-a6ad-7b725e672b93"). InnerVolumeSpecName "kube-api-access-r8jrv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.840067 4765 scope.go:117] "RemoveContainer" containerID="e533d82f4b8e23e3dce5b308ccddfc07652c194c518aef279b6c04022ca4bf7b" Mar 19 11:25:26 crc kubenswrapper[4765]: E0319 11:25:26.841292 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e533d82f4b8e23e3dce5b308ccddfc07652c194c518aef279b6c04022ca4bf7b\": container with ID starting with e533d82f4b8e23e3dce5b308ccddfc07652c194c518aef279b6c04022ca4bf7b not found: ID does not exist" containerID="e533d82f4b8e23e3dce5b308ccddfc07652c194c518aef279b6c04022ca4bf7b" Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.841338 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e533d82f4b8e23e3dce5b308ccddfc07652c194c518aef279b6c04022ca4bf7b"} err="failed to get container status \"e533d82f4b8e23e3dce5b308ccddfc07652c194c518aef279b6c04022ca4bf7b\": rpc error: code = NotFound desc = could not find container \"e533d82f4b8e23e3dce5b308ccddfc07652c194c518aef279b6c04022ca4bf7b\": container with ID starting with e533d82f4b8e23e3dce5b308ccddfc07652c194c518aef279b6c04022ca4bf7b not found: ID does not exist" Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.841368 4765 scope.go:117] "RemoveContainer" containerID="5cce4dd943518d2733f1343d0221ca9096b250e1496e862371fa404e565e916a" Mar 19 11:25:26 crc kubenswrapper[4765]: E0319 11:25:26.841997 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cce4dd943518d2733f1343d0221ca9096b250e1496e862371fa404e565e916a\": container with ID starting with 5cce4dd943518d2733f1343d0221ca9096b250e1496e862371fa404e565e916a not found: ID does not exist" containerID="5cce4dd943518d2733f1343d0221ca9096b250e1496e862371fa404e565e916a" Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.842117 
4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cce4dd943518d2733f1343d0221ca9096b250e1496e862371fa404e565e916a"} err="failed to get container status \"5cce4dd943518d2733f1343d0221ca9096b250e1496e862371fa404e565e916a\": rpc error: code = NotFound desc = could not find container \"5cce4dd943518d2733f1343d0221ca9096b250e1496e862371fa404e565e916a\": container with ID starting with 5cce4dd943518d2733f1343d0221ca9096b250e1496e862371fa404e565e916a not found: ID does not exist" Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.845560 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r8jrv\" (UniqueName: \"kubernetes.io/projected/db728e4f-3565-48a8-a6ad-7b725e672b93-kube-api-access-r8jrv\") on node \"crc\" DevicePath \"\"" Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.920200 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/db728e4f-3565-48a8-a6ad-7b725e672b93-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "db728e4f-3565-48a8-a6ad-7b725e672b93" (UID: "db728e4f-3565-48a8-a6ad-7b725e672b93"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:25:26 crc kubenswrapper[4765]: I0319 11:25:26.947344 4765 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/db728e4f-3565-48a8-a6ad-7b725e672b93-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 19 11:25:28 crc kubenswrapper[4765]: I0319 11:25:28.366023 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="db728e4f-3565-48a8-a6ad-7b725e672b93" path="/var/lib/kubelet/pods/db728e4f-3565-48a8-a6ad-7b725e672b93/volumes" Mar 19 11:25:31 crc kubenswrapper[4765]: I0319 11:25:31.656443 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:25:31 crc kubenswrapper[4765]: I0319 11:25:31.656907 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:25:31 crc kubenswrapper[4765]: I0319 11:25:31.656987 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 11:25:31 crc kubenswrapper[4765]: I0319 11:25:31.657721 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e"} pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 11:25:31 crc 
kubenswrapper[4765]: I0319 11:25:31.657778 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" gracePeriod=600 Mar 19 11:25:31 crc kubenswrapper[4765]: E0319 11:25:31.780495 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:25:32 crc kubenswrapper[4765]: I0319 11:25:32.764496 4765 generic.go:334] "Generic (PLEG): container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" exitCode=0 Mar 19 11:25:32 crc kubenswrapper[4765]: I0319 11:25:32.764542 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e"} Mar 19 11:25:32 crc kubenswrapper[4765]: I0319 11:25:32.764584 4765 scope.go:117] "RemoveContainer" containerID="9f428a0136b444a0c21f9c6c085da235b98f398bb53eda25b0fa7e3ce28d5318" Mar 19 11:25:32 crc kubenswrapper[4765]: I0319 11:25:32.765266 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:25:32 crc kubenswrapper[4765]: E0319 11:25:32.765548 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" 
with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:25:44 crc kubenswrapper[4765]: I0319 11:25:44.356296 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:25:44 crc kubenswrapper[4765]: E0319 11:25:44.357736 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:25:55 crc kubenswrapper[4765]: I0319 11:25:55.356493 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:25:55 crc kubenswrapper[4765]: E0319 11:25:55.357826 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.147889 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565326-4vw7m"] Mar 19 11:26:00 crc kubenswrapper[4765]: E0319 11:26:00.149012 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db728e4f-3565-48a8-a6ad-7b725e672b93" 
containerName="gather" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.149027 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="db728e4f-3565-48a8-a6ad-7b725e672b93" containerName="gather" Mar 19 11:26:00 crc kubenswrapper[4765]: E0319 11:26:00.149037 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="db728e4f-3565-48a8-a6ad-7b725e672b93" containerName="copy" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.149043 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="db728e4f-3565-48a8-a6ad-7b725e672b93" containerName="copy" Mar 19 11:26:00 crc kubenswrapper[4765]: E0319 11:26:00.149049 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c" containerName="registry-server" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.149055 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c" containerName="registry-server" Mar 19 11:26:00 crc kubenswrapper[4765]: E0319 11:26:00.149067 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c" containerName="extract-content" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.149072 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c" containerName="extract-content" Mar 19 11:26:00 crc kubenswrapper[4765]: E0319 11:26:00.149085 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c" containerName="extract-utilities" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.149091 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c" containerName="extract-utilities" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.149308 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="db728e4f-3565-48a8-a6ad-7b725e672b93" containerName="copy" Mar 19 11:26:00 crc 
kubenswrapper[4765]: I0319 11:26:00.149353 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="db728e4f-3565-48a8-a6ad-7b725e672b93" containerName="gather" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.149366 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8ac529a-254d-4f93-b3e7-cc5db8ce3f0c" containerName="registry-server" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.150132 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565326-4vw7m" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.152469 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.152622 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.152648 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.157045 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565326-4vw7m"] Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.337186 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w47ln\" (UniqueName: \"kubernetes.io/projected/7d11b837-8c43-45ee-b838-71d13a8b0919-kube-api-access-w47ln\") pod \"auto-csr-approver-29565326-4vw7m\" (UID: \"7d11b837-8c43-45ee-b838-71d13a8b0919\") " pod="openshift-infra/auto-csr-approver-29565326-4vw7m" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.439128 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w47ln\" (UniqueName: \"kubernetes.io/projected/7d11b837-8c43-45ee-b838-71d13a8b0919-kube-api-access-w47ln\") pod 
\"auto-csr-approver-29565326-4vw7m\" (UID: \"7d11b837-8c43-45ee-b838-71d13a8b0919\") " pod="openshift-infra/auto-csr-approver-29565326-4vw7m" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.459136 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w47ln\" (UniqueName: \"kubernetes.io/projected/7d11b837-8c43-45ee-b838-71d13a8b0919-kube-api-access-w47ln\") pod \"auto-csr-approver-29565326-4vw7m\" (UID: \"7d11b837-8c43-45ee-b838-71d13a8b0919\") " pod="openshift-infra/auto-csr-approver-29565326-4vw7m" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.478986 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565326-4vw7m" Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.916164 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565326-4vw7m"] Mar 19 11:26:00 crc kubenswrapper[4765]: I0319 11:26:00.923015 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 11:26:01 crc kubenswrapper[4765]: I0319 11:26:01.061888 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565326-4vw7m" event={"ID":"7d11b837-8c43-45ee-b838-71d13a8b0919","Type":"ContainerStarted","Data":"6c715b595698cc91e68ea366d99ea043b9bbc8a7147ecf38d986123062dcad4b"} Mar 19 11:26:03 crc kubenswrapper[4765]: I0319 11:26:03.084190 4765 generic.go:334] "Generic (PLEG): container finished" podID="7d11b837-8c43-45ee-b838-71d13a8b0919" containerID="4ad4991be769ac3ad1abcf171d281476125074255f30dae44bdabe9adfa7efa1" exitCode=0 Mar 19 11:26:03 crc kubenswrapper[4765]: I0319 11:26:03.084411 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565326-4vw7m" event={"ID":"7d11b837-8c43-45ee-b838-71d13a8b0919","Type":"ContainerDied","Data":"4ad4991be769ac3ad1abcf171d281476125074255f30dae44bdabe9adfa7efa1"} Mar 19 
11:26:04 crc kubenswrapper[4765]: I0319 11:26:04.478530 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565326-4vw7m" Mar 19 11:26:04 crc kubenswrapper[4765]: I0319 11:26:04.630643 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w47ln\" (UniqueName: \"kubernetes.io/projected/7d11b837-8c43-45ee-b838-71d13a8b0919-kube-api-access-w47ln\") pod \"7d11b837-8c43-45ee-b838-71d13a8b0919\" (UID: \"7d11b837-8c43-45ee-b838-71d13a8b0919\") " Mar 19 11:26:04 crc kubenswrapper[4765]: I0319 11:26:04.636166 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7d11b837-8c43-45ee-b838-71d13a8b0919-kube-api-access-w47ln" (OuterVolumeSpecName: "kube-api-access-w47ln") pod "7d11b837-8c43-45ee-b838-71d13a8b0919" (UID: "7d11b837-8c43-45ee-b838-71d13a8b0919"). InnerVolumeSpecName "kube-api-access-w47ln". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:26:04 crc kubenswrapper[4765]: I0319 11:26:04.732993 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w47ln\" (UniqueName: \"kubernetes.io/projected/7d11b837-8c43-45ee-b838-71d13a8b0919-kube-api-access-w47ln\") on node \"crc\" DevicePath \"\"" Mar 19 11:26:05 crc kubenswrapper[4765]: I0319 11:26:05.104676 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565326-4vw7m" event={"ID":"7d11b837-8c43-45ee-b838-71d13a8b0919","Type":"ContainerDied","Data":"6c715b595698cc91e68ea366d99ea043b9bbc8a7147ecf38d986123062dcad4b"} Mar 19 11:26:05 crc kubenswrapper[4765]: I0319 11:26:05.104720 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c715b595698cc91e68ea366d99ea043b9bbc8a7147ecf38d986123062dcad4b" Mar 19 11:26:05 crc kubenswrapper[4765]: I0319 11:26:05.104758 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565326-4vw7m" Mar 19 11:26:05 crc kubenswrapper[4765]: I0319 11:26:05.546874 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565320-hcmhc"] Mar 19 11:26:05 crc kubenswrapper[4765]: I0319 11:26:05.556342 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565320-hcmhc"] Mar 19 11:26:06 crc kubenswrapper[4765]: I0319 11:26:06.369806 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c1d1a971-9704-4789-a6b2-c6250bab2e4e" path="/var/lib/kubelet/pods/c1d1a971-9704-4789-a6b2-c6250bab2e4e/volumes" Mar 19 11:26:10 crc kubenswrapper[4765]: I0319 11:26:10.355655 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:26:10 crc kubenswrapper[4765]: E0319 11:26:10.356474 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:26:16 crc kubenswrapper[4765]: I0319 11:26:16.273723 4765 scope.go:117] "RemoveContainer" containerID="aa84a26389328563efc4b462881d3f2f6283d9512bcf29f60617d26c9ebc4eaf" Mar 19 11:26:22 crc kubenswrapper[4765]: I0319 11:26:22.363778 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:26:22 crc kubenswrapper[4765]: E0319 11:26:22.364751 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:26:36 crc kubenswrapper[4765]: I0319 11:26:36.357162 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:26:36 crc kubenswrapper[4765]: E0319 11:26:36.358174 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:26:49 crc kubenswrapper[4765]: I0319 11:26:49.356694 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:26:49 crc kubenswrapper[4765]: E0319 11:26:49.357609 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:27:03 crc kubenswrapper[4765]: I0319 11:27:03.356322 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:27:03 crc kubenswrapper[4765]: E0319 11:27:03.357343 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:27:16 crc kubenswrapper[4765]: I0319 11:27:16.355361 4765 scope.go:117] "RemoveContainer" containerID="71f88fe51935548aad177e586e889db157aa7613c6840c8c513f1270754befa2" Mar 19 11:27:17 crc kubenswrapper[4765]: I0319 11:27:17.356271 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:27:17 crc kubenswrapper[4765]: E0319 11:27:17.356949 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:27:31 crc kubenswrapper[4765]: I0319 11:27:31.356135 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:27:31 crc kubenswrapper[4765]: E0319 11:27:31.358557 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:27:42 crc kubenswrapper[4765]: I0319 11:27:42.364856 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:27:42 crc kubenswrapper[4765]: 
E0319 11:27:42.365698 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:27:56 crc kubenswrapper[4765]: I0319 11:27:56.355974 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:27:56 crc kubenswrapper[4765]: E0319 11:27:56.356733 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:28:00 crc kubenswrapper[4765]: I0319 11:28:00.154810 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565328-x8xgn"] Mar 19 11:28:00 crc kubenswrapper[4765]: E0319 11:28:00.155774 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7d11b837-8c43-45ee-b838-71d13a8b0919" containerName="oc" Mar 19 11:28:00 crc kubenswrapper[4765]: I0319 11:28:00.155789 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="7d11b837-8c43-45ee-b838-71d13a8b0919" containerName="oc" Mar 19 11:28:00 crc kubenswrapper[4765]: I0319 11:28:00.156116 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="7d11b837-8c43-45ee-b838-71d13a8b0919" containerName="oc" Mar 19 11:28:00 crc kubenswrapper[4765]: I0319 11:28:00.156896 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565328-x8xgn" Mar 19 11:28:00 crc kubenswrapper[4765]: I0319 11:28:00.158771 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:28:00 crc kubenswrapper[4765]: I0319 11:28:00.159380 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:28:00 crc kubenswrapper[4765]: I0319 11:28:00.159388 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:28:00 crc kubenswrapper[4765]: I0319 11:28:00.166884 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565328-x8xgn"] Mar 19 11:28:00 crc kubenswrapper[4765]: I0319 11:28:00.255108 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xbjrg\" (UniqueName: \"kubernetes.io/projected/061de557-4131-4b58-b828-2a8611198635-kube-api-access-xbjrg\") pod \"auto-csr-approver-29565328-x8xgn\" (UID: \"061de557-4131-4b58-b828-2a8611198635\") " pod="openshift-infra/auto-csr-approver-29565328-x8xgn" Mar 19 11:28:00 crc kubenswrapper[4765]: I0319 11:28:00.357561 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xbjrg\" (UniqueName: \"kubernetes.io/projected/061de557-4131-4b58-b828-2a8611198635-kube-api-access-xbjrg\") pod \"auto-csr-approver-29565328-x8xgn\" (UID: \"061de557-4131-4b58-b828-2a8611198635\") " pod="openshift-infra/auto-csr-approver-29565328-x8xgn" Mar 19 11:28:00 crc kubenswrapper[4765]: I0319 11:28:00.380639 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xbjrg\" (UniqueName: \"kubernetes.io/projected/061de557-4131-4b58-b828-2a8611198635-kube-api-access-xbjrg\") pod \"auto-csr-approver-29565328-x8xgn\" (UID: \"061de557-4131-4b58-b828-2a8611198635\") " 
pod="openshift-infra/auto-csr-approver-29565328-x8xgn" Mar 19 11:28:00 crc kubenswrapper[4765]: I0319 11:28:00.477858 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565328-x8xgn" Mar 19 11:28:00 crc kubenswrapper[4765]: I0319 11:28:00.913902 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565328-x8xgn"] Mar 19 11:28:00 crc kubenswrapper[4765]: W0319 11:28:00.916627 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod061de557_4131_4b58_b828_2a8611198635.slice/crio-f66246fe14c636d057027c5b1929d65fd00a418ce5d823f9f594530df334af1a WatchSource:0}: Error finding container f66246fe14c636d057027c5b1929d65fd00a418ce5d823f9f594530df334af1a: Status 404 returned error can't find the container with id f66246fe14c636d057027c5b1929d65fd00a418ce5d823f9f594530df334af1a Mar 19 11:28:01 crc kubenswrapper[4765]: I0319 11:28:01.143454 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565328-x8xgn" event={"ID":"061de557-4131-4b58-b828-2a8611198635","Type":"ContainerStarted","Data":"f66246fe14c636d057027c5b1929d65fd00a418ce5d823f9f594530df334af1a"} Mar 19 11:28:03 crc kubenswrapper[4765]: I0319 11:28:03.161788 4765 generic.go:334] "Generic (PLEG): container finished" podID="061de557-4131-4b58-b828-2a8611198635" containerID="7054ecdfe178fde890d8cadca1b134e6d33df709e2661f54976db2e9b61d2060" exitCode=0 Mar 19 11:28:03 crc kubenswrapper[4765]: I0319 11:28:03.161898 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565328-x8xgn" event={"ID":"061de557-4131-4b58-b828-2a8611198635","Type":"ContainerDied","Data":"7054ecdfe178fde890d8cadca1b134e6d33df709e2661f54976db2e9b61d2060"} Mar 19 11:28:04 crc kubenswrapper[4765]: I0319 11:28:04.566758 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565328-x8xgn" Mar 19 11:28:04 crc kubenswrapper[4765]: I0319 11:28:04.738017 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xbjrg\" (UniqueName: \"kubernetes.io/projected/061de557-4131-4b58-b828-2a8611198635-kube-api-access-xbjrg\") pod \"061de557-4131-4b58-b828-2a8611198635\" (UID: \"061de557-4131-4b58-b828-2a8611198635\") " Mar 19 11:28:04 crc kubenswrapper[4765]: I0319 11:28:04.743494 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/061de557-4131-4b58-b828-2a8611198635-kube-api-access-xbjrg" (OuterVolumeSpecName: "kube-api-access-xbjrg") pod "061de557-4131-4b58-b828-2a8611198635" (UID: "061de557-4131-4b58-b828-2a8611198635"). InnerVolumeSpecName "kube-api-access-xbjrg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:28:04 crc kubenswrapper[4765]: I0319 11:28:04.840722 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xbjrg\" (UniqueName: \"kubernetes.io/projected/061de557-4131-4b58-b828-2a8611198635-kube-api-access-xbjrg\") on node \"crc\" DevicePath \"\"" Mar 19 11:28:05 crc kubenswrapper[4765]: I0319 11:28:05.187749 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565328-x8xgn" event={"ID":"061de557-4131-4b58-b828-2a8611198635","Type":"ContainerDied","Data":"f66246fe14c636d057027c5b1929d65fd00a418ce5d823f9f594530df334af1a"} Mar 19 11:28:05 crc kubenswrapper[4765]: I0319 11:28:05.187785 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f66246fe14c636d057027c5b1929d65fd00a418ce5d823f9f594530df334af1a" Mar 19 11:28:05 crc kubenswrapper[4765]: I0319 11:28:05.188086 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565328-x8xgn" Mar 19 11:28:05 crc kubenswrapper[4765]: I0319 11:28:05.635625 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565322-d6wnz"] Mar 19 11:28:05 crc kubenswrapper[4765]: I0319 11:28:05.646344 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565322-d6wnz"] Mar 19 11:28:06 crc kubenswrapper[4765]: I0319 11:28:06.366543 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="af1449ee-08fd-46db-a3eb-e136c881b7e0" path="/var/lib/kubelet/pods/af1449ee-08fd-46db-a3eb-e136c881b7e0/volumes" Mar 19 11:28:08 crc kubenswrapper[4765]: I0319 11:28:08.355952 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:28:08 crc kubenswrapper[4765]: E0319 11:28:08.356555 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:28:16 crc kubenswrapper[4765]: I0319 11:28:16.406499 4765 scope.go:117] "RemoveContainer" containerID="f8ee3491268a44a22190e7d95703d22dea1e44dc97d73190c35fd16db729d67e" Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.319550 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-2xd55"] Mar 19 11:28:20 crc kubenswrapper[4765]: E0319 11:28:20.321043 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="061de557-4131-4b58-b828-2a8611198635" containerName="oc" Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.321134 4765 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="061de557-4131-4b58-b828-2a8611198635" containerName="oc" Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.321370 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="061de557-4131-4b58-b828-2a8611198635" containerName="oc" Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.323058 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.340210 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2xd55"] Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.359675 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:28:20 crc kubenswrapper[4765]: E0319 11:28:20.372771 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.487740 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zr2xw\" (UniqueName: \"kubernetes.io/projected/3257026c-ae61-40c4-817f-55142e65ee0e-kube-api-access-zr2xw\") pod \"certified-operators-2xd55\" (UID: \"3257026c-ae61-40c4-817f-55142e65ee0e\") " pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.488151 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3257026c-ae61-40c4-817f-55142e65ee0e-utilities\") 
pod \"certified-operators-2xd55\" (UID: \"3257026c-ae61-40c4-817f-55142e65ee0e\") " pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.488468 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3257026c-ae61-40c4-817f-55142e65ee0e-catalog-content\") pod \"certified-operators-2xd55\" (UID: \"3257026c-ae61-40c4-817f-55142e65ee0e\") " pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.590527 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3257026c-ae61-40c4-817f-55142e65ee0e-utilities\") pod \"certified-operators-2xd55\" (UID: \"3257026c-ae61-40c4-817f-55142e65ee0e\") " pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.590758 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3257026c-ae61-40c4-817f-55142e65ee0e-catalog-content\") pod \"certified-operators-2xd55\" (UID: \"3257026c-ae61-40c4-817f-55142e65ee0e\") " pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.590841 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zr2xw\" (UniqueName: \"kubernetes.io/projected/3257026c-ae61-40c4-817f-55142e65ee0e-kube-api-access-zr2xw\") pod \"certified-operators-2xd55\" (UID: \"3257026c-ae61-40c4-817f-55142e65ee0e\") " pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.591441 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3257026c-ae61-40c4-817f-55142e65ee0e-catalog-content\") pod 
\"certified-operators-2xd55\" (UID: \"3257026c-ae61-40c4-817f-55142e65ee0e\") " pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.591793 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3257026c-ae61-40c4-817f-55142e65ee0e-utilities\") pod \"certified-operators-2xd55\" (UID: \"3257026c-ae61-40c4-817f-55142e65ee0e\") " pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.617066 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zr2xw\" (UniqueName: \"kubernetes.io/projected/3257026c-ae61-40c4-817f-55142e65ee0e-kube-api-access-zr2xw\") pod \"certified-operators-2xd55\" (UID: \"3257026c-ae61-40c4-817f-55142e65ee0e\") " pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:20 crc kubenswrapper[4765]: I0319 11:28:20.659429 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:21 crc kubenswrapper[4765]: I0319 11:28:21.183612 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-2xd55"] Mar 19 11:28:21 crc kubenswrapper[4765]: I0319 11:28:21.353822 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2xd55" event={"ID":"3257026c-ae61-40c4-817f-55142e65ee0e","Type":"ContainerStarted","Data":"0b83aec6caab0c53a4d6d6cf455dc3dcc3a4e51c5eaffadc6cca3a48b5c2f817"} Mar 19 11:28:22 crc kubenswrapper[4765]: I0319 11:28:22.371497 4765 generic.go:334] "Generic (PLEG): container finished" podID="3257026c-ae61-40c4-817f-55142e65ee0e" containerID="554e0e8e13b3d80377d39f715c72b7af5d3ed392b339e35288c37f2e6bbd5b71" exitCode=0 Mar 19 11:28:22 crc kubenswrapper[4765]: I0319 11:28:22.372242 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2xd55" event={"ID":"3257026c-ae61-40c4-817f-55142e65ee0e","Type":"ContainerDied","Data":"554e0e8e13b3d80377d39f715c72b7af5d3ed392b339e35288c37f2e6bbd5b71"} Mar 19 11:28:23 crc kubenswrapper[4765]: I0319 11:28:23.388171 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2xd55" event={"ID":"3257026c-ae61-40c4-817f-55142e65ee0e","Type":"ContainerStarted","Data":"d60e2e495bafd6b555b942d8051a71e4dd44f04a357bc95943ccf19e9b903a61"} Mar 19 11:28:25 crc kubenswrapper[4765]: I0319 11:28:25.413432 4765 generic.go:334] "Generic (PLEG): container finished" podID="3257026c-ae61-40c4-817f-55142e65ee0e" containerID="d60e2e495bafd6b555b942d8051a71e4dd44f04a357bc95943ccf19e9b903a61" exitCode=0 Mar 19 11:28:25 crc kubenswrapper[4765]: I0319 11:28:25.413488 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2xd55" 
event={"ID":"3257026c-ae61-40c4-817f-55142e65ee0e","Type":"ContainerDied","Data":"d60e2e495bafd6b555b942d8051a71e4dd44f04a357bc95943ccf19e9b903a61"} Mar 19 11:28:26 crc kubenswrapper[4765]: I0319 11:28:26.429500 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2xd55" event={"ID":"3257026c-ae61-40c4-817f-55142e65ee0e","Type":"ContainerStarted","Data":"4e1ef49555a188440b86916ebe45f38a8db23036805315d00883570c514f67a6"} Mar 19 11:28:26 crc kubenswrapper[4765]: I0319 11:28:26.454609 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-2xd55" podStartSLOduration=2.9924541810000003 podStartE2EDuration="6.454588075s" podCreationTimestamp="2026-03-19 11:28:20 +0000 UTC" firstStartedPulling="2026-03-19 11:28:22.377334659 +0000 UTC m=+4000.726280201" lastFinishedPulling="2026-03-19 11:28:25.839468553 +0000 UTC m=+4004.188414095" observedRunningTime="2026-03-19 11:28:26.451314766 +0000 UTC m=+4004.800260308" watchObservedRunningTime="2026-03-19 11:28:26.454588075 +0000 UTC m=+4004.803533617" Mar 19 11:28:30 crc kubenswrapper[4765]: I0319 11:28:30.659638 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:30 crc kubenswrapper[4765]: I0319 11:28:30.661420 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:30 crc kubenswrapper[4765]: I0319 11:28:30.754913 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:31 crc kubenswrapper[4765]: I0319 11:28:31.625572 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:31 crc kubenswrapper[4765]: I0319 11:28:31.694762 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/certified-operators-2xd55"] Mar 19 11:28:32 crc kubenswrapper[4765]: I0319 11:28:32.367167 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:28:32 crc kubenswrapper[4765]: E0319 11:28:32.368178 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:28:33 crc kubenswrapper[4765]: I0319 11:28:33.597327 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-2xd55" podUID="3257026c-ae61-40c4-817f-55142e65ee0e" containerName="registry-server" containerID="cri-o://4e1ef49555a188440b86916ebe45f38a8db23036805315d00883570c514f67a6" gracePeriod=2 Mar 19 11:28:34 crc kubenswrapper[4765]: I0319 11:28:34.547133 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fkmjb/must-gather-4rn4d"] Mar 19 11:28:34 crc kubenswrapper[4765]: I0319 11:28:34.555820 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fkmjb/must-gather-4rn4d" Mar 19 11:28:34 crc kubenswrapper[4765]: I0319 11:28:34.560146 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-must-gather-fkmjb"/"default-dockercfg-kzc9f" Mar 19 11:28:34 crc kubenswrapper[4765]: I0319 11:28:34.560346 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fkmjb"/"openshift-service-ca.crt" Mar 19 11:28:34 crc kubenswrapper[4765]: I0319 11:28:34.560638 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fkmjb/must-gather-4rn4d"] Mar 19 11:28:34 crc kubenswrapper[4765]: I0319 11:28:34.562400 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-fkmjb"/"kube-root-ca.crt" Mar 19 11:28:34 crc kubenswrapper[4765]: I0319 11:28:34.634881 4765 generic.go:334] "Generic (PLEG): container finished" podID="3257026c-ae61-40c4-817f-55142e65ee0e" containerID="4e1ef49555a188440b86916ebe45f38a8db23036805315d00883570c514f67a6" exitCode=0 Mar 19 11:28:34 crc kubenswrapper[4765]: I0319 11:28:34.634932 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2xd55" event={"ID":"3257026c-ae61-40c4-817f-55142e65ee0e","Type":"ContainerDied","Data":"4e1ef49555a188440b86916ebe45f38a8db23036805315d00883570c514f67a6"} Mar 19 11:28:34 crc kubenswrapper[4765]: I0319 11:28:34.719668 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbbhj\" (UniqueName: \"kubernetes.io/projected/9acef296-5e79-49cf-867a-124137b68d69-kube-api-access-tbbhj\") pod \"must-gather-4rn4d\" (UID: \"9acef296-5e79-49cf-867a-124137b68d69\") " pod="openshift-must-gather-fkmjb/must-gather-4rn4d" Mar 19 11:28:34 crc kubenswrapper[4765]: I0319 11:28:34.719730 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/9acef296-5e79-49cf-867a-124137b68d69-must-gather-output\") pod \"must-gather-4rn4d\" (UID: \"9acef296-5e79-49cf-867a-124137b68d69\") " pod="openshift-must-gather-fkmjb/must-gather-4rn4d" Mar 19 11:28:34 crc kubenswrapper[4765]: I0319 11:28:34.825401 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbbhj\" (UniqueName: \"kubernetes.io/projected/9acef296-5e79-49cf-867a-124137b68d69-kube-api-access-tbbhj\") pod \"must-gather-4rn4d\" (UID: \"9acef296-5e79-49cf-867a-124137b68d69\") " pod="openshift-must-gather-fkmjb/must-gather-4rn4d" Mar 19 11:28:34 crc kubenswrapper[4765]: I0319 11:28:34.825828 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9acef296-5e79-49cf-867a-124137b68d69-must-gather-output\") pod \"must-gather-4rn4d\" (UID: \"9acef296-5e79-49cf-867a-124137b68d69\") " pod="openshift-must-gather-fkmjb/must-gather-4rn4d" Mar 19 11:28:34 crc kubenswrapper[4765]: I0319 11:28:34.826406 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9acef296-5e79-49cf-867a-124137b68d69-must-gather-output\") pod \"must-gather-4rn4d\" (UID: \"9acef296-5e79-49cf-867a-124137b68d69\") " pod="openshift-must-gather-fkmjb/must-gather-4rn4d" Mar 19 11:28:34 crc kubenswrapper[4765]: I0319 11:28:34.865975 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbbhj\" (UniqueName: \"kubernetes.io/projected/9acef296-5e79-49cf-867a-124137b68d69-kube-api-access-tbbhj\") pod \"must-gather-4rn4d\" (UID: \"9acef296-5e79-49cf-867a-124137b68d69\") " pod="openshift-must-gather-fkmjb/must-gather-4rn4d" Mar 19 11:28:34 crc kubenswrapper[4765]: I0319 11:28:34.881417 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fkmjb/must-gather-4rn4d" Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.018270 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.131867 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3257026c-ae61-40c4-817f-55142e65ee0e-catalog-content\") pod \"3257026c-ae61-40c4-817f-55142e65ee0e\" (UID: \"3257026c-ae61-40c4-817f-55142e65ee0e\") " Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.132116 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3257026c-ae61-40c4-817f-55142e65ee0e-utilities\") pod \"3257026c-ae61-40c4-817f-55142e65ee0e\" (UID: \"3257026c-ae61-40c4-817f-55142e65ee0e\") " Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.132229 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zr2xw\" (UniqueName: \"kubernetes.io/projected/3257026c-ae61-40c4-817f-55142e65ee0e-kube-api-access-zr2xw\") pod \"3257026c-ae61-40c4-817f-55142e65ee0e\" (UID: \"3257026c-ae61-40c4-817f-55142e65ee0e\") " Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.133500 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3257026c-ae61-40c4-817f-55142e65ee0e-utilities" (OuterVolumeSpecName: "utilities") pod "3257026c-ae61-40c4-817f-55142e65ee0e" (UID: "3257026c-ae61-40c4-817f-55142e65ee0e"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.141295 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3257026c-ae61-40c4-817f-55142e65ee0e-kube-api-access-zr2xw" (OuterVolumeSpecName: "kube-api-access-zr2xw") pod "3257026c-ae61-40c4-817f-55142e65ee0e" (UID: "3257026c-ae61-40c4-817f-55142e65ee0e"). InnerVolumeSpecName "kube-api-access-zr2xw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.220845 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3257026c-ae61-40c4-817f-55142e65ee0e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3257026c-ae61-40c4-817f-55142e65ee0e" (UID: "3257026c-ae61-40c4-817f-55142e65ee0e"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.234340 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zr2xw\" (UniqueName: \"kubernetes.io/projected/3257026c-ae61-40c4-817f-55142e65ee0e-kube-api-access-zr2xw\") on node \"crc\" DevicePath \"\"" Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.234385 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3257026c-ae61-40c4-817f-55142e65ee0e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.234398 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3257026c-ae61-40c4-817f-55142e65ee0e-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.429157 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-fkmjb/must-gather-4rn4d"] Mar 19 11:28:35 crc kubenswrapper[4765]: W0319 
11:28:35.439704 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9acef296_5e79_49cf_867a_124137b68d69.slice/crio-1c35f9f13db8a3dff917bf1d3aedec890381949bb51ee5dc759b222232f87ec2 WatchSource:0}: Error finding container 1c35f9f13db8a3dff917bf1d3aedec890381949bb51ee5dc759b222232f87ec2: Status 404 returned error can't find the container with id 1c35f9f13db8a3dff917bf1d3aedec890381949bb51ee5dc759b222232f87ec2 Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.648607 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-2xd55" event={"ID":"3257026c-ae61-40c4-817f-55142e65ee0e","Type":"ContainerDied","Data":"0b83aec6caab0c53a4d6d6cf455dc3dcc3a4e51c5eaffadc6cca3a48b5c2f817"} Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.648645 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-2xd55" Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.648669 4765 scope.go:117] "RemoveContainer" containerID="4e1ef49555a188440b86916ebe45f38a8db23036805315d00883570c514f67a6" Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.649891 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fkmjb/must-gather-4rn4d" event={"ID":"9acef296-5e79-49cf-867a-124137b68d69","Type":"ContainerStarted","Data":"1c35f9f13db8a3dff917bf1d3aedec890381949bb51ee5dc759b222232f87ec2"} Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.677287 4765 scope.go:117] "RemoveContainer" containerID="d60e2e495bafd6b555b942d8051a71e4dd44f04a357bc95943ccf19e9b903a61" Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.729861 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-2xd55"] Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.735529 4765 scope.go:117] "RemoveContainer" 
containerID="554e0e8e13b3d80377d39f715c72b7af5d3ed392b339e35288c37f2e6bbd5b71" Mar 19 11:28:35 crc kubenswrapper[4765]: I0319 11:28:35.749559 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-2xd55"] Mar 19 11:28:36 crc kubenswrapper[4765]: I0319 11:28:36.369350 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3257026c-ae61-40c4-817f-55142e65ee0e" path="/var/lib/kubelet/pods/3257026c-ae61-40c4-817f-55142e65ee0e/volumes" Mar 19 11:28:36 crc kubenswrapper[4765]: I0319 11:28:36.663474 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fkmjb/must-gather-4rn4d" event={"ID":"9acef296-5e79-49cf-867a-124137b68d69","Type":"ContainerStarted","Data":"a39c901d8b22164045c9c8bc3099e8f3ffde9cb1271ab1e6058dcd354dc5e897"} Mar 19 11:28:36 crc kubenswrapper[4765]: I0319 11:28:36.663523 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fkmjb/must-gather-4rn4d" event={"ID":"9acef296-5e79-49cf-867a-124137b68d69","Type":"ContainerStarted","Data":"c001e3799f1edd8368719265ddd7f5397c8923fe18420780a16641a10cab45ba"} Mar 19 11:28:36 crc kubenswrapper[4765]: I0319 11:28:36.688030 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fkmjb/must-gather-4rn4d" podStartSLOduration=2.688006401 podStartE2EDuration="2.688006401s" podCreationTimestamp="2026-03-19 11:28:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:28:36.677107508 +0000 UTC m=+4015.026053040" watchObservedRunningTime="2026-03-19 11:28:36.688006401 +0000 UTC m=+4015.036951963" Mar 19 11:28:40 crc kubenswrapper[4765]: I0319 11:28:40.056370 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fkmjb/crc-debug-b8bpr"] Mar 19 11:28:40 crc kubenswrapper[4765]: E0319 11:28:40.058343 4765 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="3257026c-ae61-40c4-817f-55142e65ee0e" containerName="registry-server" Mar 19 11:28:40 crc kubenswrapper[4765]: I0319 11:28:40.058437 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3257026c-ae61-40c4-817f-55142e65ee0e" containerName="registry-server" Mar 19 11:28:40 crc kubenswrapper[4765]: E0319 11:28:40.058518 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3257026c-ae61-40c4-817f-55142e65ee0e" containerName="extract-utilities" Mar 19 11:28:40 crc kubenswrapper[4765]: I0319 11:28:40.058589 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3257026c-ae61-40c4-817f-55142e65ee0e" containerName="extract-utilities" Mar 19 11:28:40 crc kubenswrapper[4765]: E0319 11:28:40.058686 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3257026c-ae61-40c4-817f-55142e65ee0e" containerName="extract-content" Mar 19 11:28:40 crc kubenswrapper[4765]: I0319 11:28:40.058742 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3257026c-ae61-40c4-817f-55142e65ee0e" containerName="extract-content" Mar 19 11:28:40 crc kubenswrapper[4765]: I0319 11:28:40.059085 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3257026c-ae61-40c4-817f-55142e65ee0e" containerName="registry-server" Mar 19 11:28:40 crc kubenswrapper[4765]: I0319 11:28:40.059817 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fkmjb/crc-debug-b8bpr" Mar 19 11:28:40 crc kubenswrapper[4765]: I0319 11:28:40.147352 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6-host\") pod \"crc-debug-b8bpr\" (UID: \"0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6\") " pod="openshift-must-gather-fkmjb/crc-debug-b8bpr" Mar 19 11:28:40 crc kubenswrapper[4765]: I0319 11:28:40.147601 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czsrw\" (UniqueName: \"kubernetes.io/projected/0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6-kube-api-access-czsrw\") pod \"crc-debug-b8bpr\" (UID: \"0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6\") " pod="openshift-must-gather-fkmjb/crc-debug-b8bpr" Mar 19 11:28:40 crc kubenswrapper[4765]: I0319 11:28:40.249324 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6-host\") pod \"crc-debug-b8bpr\" (UID: \"0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6\") " pod="openshift-must-gather-fkmjb/crc-debug-b8bpr" Mar 19 11:28:40 crc kubenswrapper[4765]: I0319 11:28:40.249857 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czsrw\" (UniqueName: \"kubernetes.io/projected/0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6-kube-api-access-czsrw\") pod \"crc-debug-b8bpr\" (UID: \"0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6\") " pod="openshift-must-gather-fkmjb/crc-debug-b8bpr" Mar 19 11:28:40 crc kubenswrapper[4765]: I0319 11:28:40.249559 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6-host\") pod \"crc-debug-b8bpr\" (UID: \"0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6\") " pod="openshift-must-gather-fkmjb/crc-debug-b8bpr" Mar 19 11:28:40 crc 
kubenswrapper[4765]: I0319 11:28:40.277805 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czsrw\" (UniqueName: \"kubernetes.io/projected/0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6-kube-api-access-czsrw\") pod \"crc-debug-b8bpr\" (UID: \"0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6\") " pod="openshift-must-gather-fkmjb/crc-debug-b8bpr" Mar 19 11:28:40 crc kubenswrapper[4765]: I0319 11:28:40.381909 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fkmjb/crc-debug-b8bpr" Mar 19 11:28:40 crc kubenswrapper[4765]: W0319 11:28:40.410999 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0d7efd74_f453_42eb_b9ca_6d8a47d8e1a6.slice/crio-4059ed6dc109d11c8b025b0c3a585b78de82ff5e5f6648e62232d16793287871 WatchSource:0}: Error finding container 4059ed6dc109d11c8b025b0c3a585b78de82ff5e5f6648e62232d16793287871: Status 404 returned error can't find the container with id 4059ed6dc109d11c8b025b0c3a585b78de82ff5e5f6648e62232d16793287871 Mar 19 11:28:40 crc kubenswrapper[4765]: I0319 11:28:40.719462 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fkmjb/crc-debug-b8bpr" event={"ID":"0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6","Type":"ContainerStarted","Data":"a3a9d03d29f799d7ecc45339250f16ed851184c46eebbcd8f2f936214ffcd765"} Mar 19 11:28:40 crc kubenswrapper[4765]: I0319 11:28:40.719811 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fkmjb/crc-debug-b8bpr" event={"ID":"0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6","Type":"ContainerStarted","Data":"4059ed6dc109d11c8b025b0c3a585b78de82ff5e5f6648e62232d16793287871"} Mar 19 11:28:40 crc kubenswrapper[4765]: I0319 11:28:40.745840 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-fkmjb/crc-debug-b8bpr" podStartSLOduration=0.745819072 podStartE2EDuration="745.819072ms" 
podCreationTimestamp="2026-03-19 11:28:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-19 11:28:40.736350307 +0000 UTC m=+4019.085295869" watchObservedRunningTime="2026-03-19 11:28:40.745819072 +0000 UTC m=+4019.094764614" Mar 19 11:28:45 crc kubenswrapper[4765]: I0319 11:28:45.356890 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:28:45 crc kubenswrapper[4765]: E0319 11:28:45.358029 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:28:59 crc kubenswrapper[4765]: I0319 11:28:59.356583 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:28:59 crc kubenswrapper[4765]: E0319 11:28:59.357264 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:29:02 crc kubenswrapper[4765]: I0319 11:29:02.951482 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-ps4g2"] Mar 19 11:29:02 crc kubenswrapper[4765]: I0319 11:29:02.953819 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.005082 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ps4g2"] Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.032592 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vq7cz\" (UniqueName: \"kubernetes.io/projected/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-kube-api-access-vq7cz\") pod \"community-operators-ps4g2\" (UID: \"d00ae6de-8c56-447b-806c-bf1dfe25cfa8\") " pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.032703 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-utilities\") pod \"community-operators-ps4g2\" (UID: \"d00ae6de-8c56-447b-806c-bf1dfe25cfa8\") " pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.033065 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-catalog-content\") pod \"community-operators-ps4g2\" (UID: \"d00ae6de-8c56-447b-806c-bf1dfe25cfa8\") " pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.136851 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-catalog-content\") pod \"community-operators-ps4g2\" (UID: \"d00ae6de-8c56-447b-806c-bf1dfe25cfa8\") " pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.137676 4765 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-vq7cz\" (UniqueName: \"kubernetes.io/projected/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-kube-api-access-vq7cz\") pod \"community-operators-ps4g2\" (UID: \"d00ae6de-8c56-447b-806c-bf1dfe25cfa8\") " pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.137736 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-utilities\") pod \"community-operators-ps4g2\" (UID: \"d00ae6de-8c56-447b-806c-bf1dfe25cfa8\") " pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.138250 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-utilities\") pod \"community-operators-ps4g2\" (UID: \"d00ae6de-8c56-447b-806c-bf1dfe25cfa8\") " pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.138641 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-catalog-content\") pod \"community-operators-ps4g2\" (UID: \"d00ae6de-8c56-447b-806c-bf1dfe25cfa8\") " pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.165489 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bxhsm"] Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.167680 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.176171 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vq7cz\" (UniqueName: \"kubernetes.io/projected/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-kube-api-access-vq7cz\") pod \"community-operators-ps4g2\" (UID: \"d00ae6de-8c56-447b-806c-bf1dfe25cfa8\") " pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.182224 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bxhsm"] Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.240456 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d78ck\" (UniqueName: \"kubernetes.io/projected/c54cfe34-1d42-488d-b462-cdb622dfcfc8-kube-api-access-d78ck\") pod \"redhat-operators-bxhsm\" (UID: \"c54cfe34-1d42-488d-b462-cdb622dfcfc8\") " pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.240632 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c54cfe34-1d42-488d-b462-cdb622dfcfc8-catalog-content\") pod \"redhat-operators-bxhsm\" (UID: \"c54cfe34-1d42-488d-b462-cdb622dfcfc8\") " pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.240750 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c54cfe34-1d42-488d-b462-cdb622dfcfc8-utilities\") pod \"redhat-operators-bxhsm\" (UID: \"c54cfe34-1d42-488d-b462-cdb622dfcfc8\") " pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.273659 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.341941 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d78ck\" (UniqueName: \"kubernetes.io/projected/c54cfe34-1d42-488d-b462-cdb622dfcfc8-kube-api-access-d78ck\") pod \"redhat-operators-bxhsm\" (UID: \"c54cfe34-1d42-488d-b462-cdb622dfcfc8\") " pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.342002 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c54cfe34-1d42-488d-b462-cdb622dfcfc8-catalog-content\") pod \"redhat-operators-bxhsm\" (UID: \"c54cfe34-1d42-488d-b462-cdb622dfcfc8\") " pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.342079 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c54cfe34-1d42-488d-b462-cdb622dfcfc8-utilities\") pod \"redhat-operators-bxhsm\" (UID: \"c54cfe34-1d42-488d-b462-cdb622dfcfc8\") " pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.342524 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c54cfe34-1d42-488d-b462-cdb622dfcfc8-utilities\") pod \"redhat-operators-bxhsm\" (UID: \"c54cfe34-1d42-488d-b462-cdb622dfcfc8\") " pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.343442 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c54cfe34-1d42-488d-b462-cdb622dfcfc8-catalog-content\") pod \"redhat-operators-bxhsm\" (UID: \"c54cfe34-1d42-488d-b462-cdb622dfcfc8\") " 
pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.365768 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d78ck\" (UniqueName: \"kubernetes.io/projected/c54cfe34-1d42-488d-b462-cdb622dfcfc8-kube-api-access-d78ck\") pod \"redhat-operators-bxhsm\" (UID: \"c54cfe34-1d42-488d-b462-cdb622dfcfc8\") " pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.565493 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:03 crc kubenswrapper[4765]: I0319 11:29:03.910560 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-ps4g2"] Mar 19 11:29:04 crc kubenswrapper[4765]: I0319 11:29:04.214405 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bxhsm"] Mar 19 11:29:04 crc kubenswrapper[4765]: W0319 11:29:04.222837 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc54cfe34_1d42_488d_b462_cdb622dfcfc8.slice/crio-39f5a85f527dc3ca5f9697be01f798b0dafa6e843489f609599b41032e1a80ed WatchSource:0}: Error finding container 39f5a85f527dc3ca5f9697be01f798b0dafa6e843489f609599b41032e1a80ed: Status 404 returned error can't find the container with id 39f5a85f527dc3ca5f9697be01f798b0dafa6e843489f609599b41032e1a80ed Mar 19 11:29:04 crc kubenswrapper[4765]: I0319 11:29:04.928661 4765 generic.go:334] "Generic (PLEG): container finished" podID="c54cfe34-1d42-488d-b462-cdb622dfcfc8" containerID="355ece33635cdffc5a854ef435c615bbbd6a42a0c0ebbd66f3cc31669344d521" exitCode=0 Mar 19 11:29:04 crc kubenswrapper[4765]: I0319 11:29:04.928950 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bxhsm" 
event={"ID":"c54cfe34-1d42-488d-b462-cdb622dfcfc8","Type":"ContainerDied","Data":"355ece33635cdffc5a854ef435c615bbbd6a42a0c0ebbd66f3cc31669344d521"} Mar 19 11:29:04 crc kubenswrapper[4765]: I0319 11:29:04.929027 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bxhsm" event={"ID":"c54cfe34-1d42-488d-b462-cdb622dfcfc8","Type":"ContainerStarted","Data":"39f5a85f527dc3ca5f9697be01f798b0dafa6e843489f609599b41032e1a80ed"} Mar 19 11:29:04 crc kubenswrapper[4765]: I0319 11:29:04.932172 4765 generic.go:334] "Generic (PLEG): container finished" podID="d00ae6de-8c56-447b-806c-bf1dfe25cfa8" containerID="9cc03cbb86e1c0d380670fc7d6db5fff8cb63a911ca77d89a237a14f367639a1" exitCode=0 Mar 19 11:29:04 crc kubenswrapper[4765]: I0319 11:29:04.932208 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps4g2" event={"ID":"d00ae6de-8c56-447b-806c-bf1dfe25cfa8","Type":"ContainerDied","Data":"9cc03cbb86e1c0d380670fc7d6db5fff8cb63a911ca77d89a237a14f367639a1"} Mar 19 11:29:04 crc kubenswrapper[4765]: I0319 11:29:04.932232 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps4g2" event={"ID":"d00ae6de-8c56-447b-806c-bf1dfe25cfa8","Type":"ContainerStarted","Data":"cef586ce4be097aa1cc036165d8c67100e88cb7ec6adcf870df106ead930ba1a"} Mar 19 11:29:05 crc kubenswrapper[4765]: I0319 11:29:05.941221 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bxhsm" event={"ID":"c54cfe34-1d42-488d-b462-cdb622dfcfc8","Type":"ContainerStarted","Data":"9ff53f95291e0bf58e3c1aea6ec6f99f203489f0089e502f4ff239ce91bb2886"} Mar 19 11:29:06 crc kubenswrapper[4765]: E0319 11:29:06.492493 4765 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd00ae6de_8c56_447b_806c_bf1dfe25cfa8.slice/crio-conmon-33afa67bf5eaa6d6a94bf6c247a14cad0c8983aabcb82c6725159b314d0e833d.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd00ae6de_8c56_447b_806c_bf1dfe25cfa8.slice/crio-33afa67bf5eaa6d6a94bf6c247a14cad0c8983aabcb82c6725159b314d0e833d.scope\": RecentStats: unable to find data in memory cache]" Mar 19 11:29:06 crc kubenswrapper[4765]: I0319 11:29:06.952908 4765 generic.go:334] "Generic (PLEG): container finished" podID="c54cfe34-1d42-488d-b462-cdb622dfcfc8" containerID="9ff53f95291e0bf58e3c1aea6ec6f99f203489f0089e502f4ff239ce91bb2886" exitCode=0 Mar 19 11:29:06 crc kubenswrapper[4765]: I0319 11:29:06.953005 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bxhsm" event={"ID":"c54cfe34-1d42-488d-b462-cdb622dfcfc8","Type":"ContainerDied","Data":"9ff53f95291e0bf58e3c1aea6ec6f99f203489f0089e502f4ff239ce91bb2886"} Mar 19 11:29:06 crc kubenswrapper[4765]: I0319 11:29:06.955606 4765 generic.go:334] "Generic (PLEG): container finished" podID="d00ae6de-8c56-447b-806c-bf1dfe25cfa8" containerID="33afa67bf5eaa6d6a94bf6c247a14cad0c8983aabcb82c6725159b314d0e833d" exitCode=0 Mar 19 11:29:06 crc kubenswrapper[4765]: I0319 11:29:06.955646 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps4g2" event={"ID":"d00ae6de-8c56-447b-806c-bf1dfe25cfa8","Type":"ContainerDied","Data":"33afa67bf5eaa6d6a94bf6c247a14cad0c8983aabcb82c6725159b314d0e833d"} Mar 19 11:29:07 crc kubenswrapper[4765]: I0319 11:29:07.966962 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bxhsm" event={"ID":"c54cfe34-1d42-488d-b462-cdb622dfcfc8","Type":"ContainerStarted","Data":"2c694efa9e0ce2ca169dc8821ed694c60f97333ec14b1ac60dc034f10062b0da"} Mar 19 11:29:07 crc kubenswrapper[4765]: 
I0319 11:29:07.970103 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps4g2" event={"ID":"d00ae6de-8c56-447b-806c-bf1dfe25cfa8","Type":"ContainerStarted","Data":"ee143574aef0448cbe68f7de0e04ddd3ae01c84cdf2dee8ee6b04a316e8ba760"} Mar 19 11:29:07 crc kubenswrapper[4765]: I0319 11:29:07.994511 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bxhsm" podStartSLOduration=2.537767733 podStartE2EDuration="4.994489224s" podCreationTimestamp="2026-03-19 11:29:03 +0000 UTC" firstStartedPulling="2026-03-19 11:29:04.930992286 +0000 UTC m=+4043.279937828" lastFinishedPulling="2026-03-19 11:29:07.387713767 +0000 UTC m=+4045.736659319" observedRunningTime="2026-03-19 11:29:07.984951586 +0000 UTC m=+4046.333897128" watchObservedRunningTime="2026-03-19 11:29:07.994489224 +0000 UTC m=+4046.343434776" Mar 19 11:29:08 crc kubenswrapper[4765]: I0319 11:29:08.014363 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-ps4g2" podStartSLOduration=3.536772975 podStartE2EDuration="6.014337339s" podCreationTimestamp="2026-03-19 11:29:02 +0000 UTC" firstStartedPulling="2026-03-19 11:29:04.934606433 +0000 UTC m=+4043.283551975" lastFinishedPulling="2026-03-19 11:29:07.412170797 +0000 UTC m=+4045.761116339" observedRunningTime="2026-03-19 11:29:08.006753904 +0000 UTC m=+4046.355699456" watchObservedRunningTime="2026-03-19 11:29:08.014337339 +0000 UTC m=+4046.363282881" Mar 19 11:29:13 crc kubenswrapper[4765]: I0319 11:29:13.273839 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:13 crc kubenswrapper[4765]: I0319 11:29:13.274467 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:13 crc kubenswrapper[4765]: I0319 11:29:13.327659 4765 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:13 crc kubenswrapper[4765]: I0319 11:29:13.356106 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:29:13 crc kubenswrapper[4765]: E0319 11:29:13.356370 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:29:13 crc kubenswrapper[4765]: I0319 11:29:13.567027 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:13 crc kubenswrapper[4765]: I0319 11:29:13.567074 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:14 crc kubenswrapper[4765]: I0319 11:29:14.082833 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:14 crc kubenswrapper[4765]: I0319 11:29:14.147053 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ps4g2"] Mar 19 11:29:14 crc kubenswrapper[4765]: I0319 11:29:14.621010 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bxhsm" podUID="c54cfe34-1d42-488d-b462-cdb622dfcfc8" containerName="registry-server" probeResult="failure" output=< Mar 19 11:29:14 crc kubenswrapper[4765]: timeout: failed to connect service ":50051" within 1s Mar 19 11:29:14 crc kubenswrapper[4765]: > Mar 19 11:29:16 crc kubenswrapper[4765]: I0319 
11:29:16.049172 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-ps4g2" podUID="d00ae6de-8c56-447b-806c-bf1dfe25cfa8" containerName="registry-server" containerID="cri-o://ee143574aef0448cbe68f7de0e04ddd3ae01c84cdf2dee8ee6b04a316e8ba760" gracePeriod=2 Mar 19 11:29:16 crc kubenswrapper[4765]: I0319 11:29:16.583024 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:16 crc kubenswrapper[4765]: I0319 11:29:16.602287 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-catalog-content\") pod \"d00ae6de-8c56-447b-806c-bf1dfe25cfa8\" (UID: \"d00ae6de-8c56-447b-806c-bf1dfe25cfa8\") " Mar 19 11:29:16 crc kubenswrapper[4765]: I0319 11:29:16.602555 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-utilities\") pod \"d00ae6de-8c56-447b-806c-bf1dfe25cfa8\" (UID: \"d00ae6de-8c56-447b-806c-bf1dfe25cfa8\") " Mar 19 11:29:16 crc kubenswrapper[4765]: I0319 11:29:16.602665 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vq7cz\" (UniqueName: \"kubernetes.io/projected/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-kube-api-access-vq7cz\") pod \"d00ae6de-8c56-447b-806c-bf1dfe25cfa8\" (UID: \"d00ae6de-8c56-447b-806c-bf1dfe25cfa8\") " Mar 19 11:29:16 crc kubenswrapper[4765]: I0319 11:29:16.603368 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-utilities" (OuterVolumeSpecName: "utilities") pod "d00ae6de-8c56-447b-806c-bf1dfe25cfa8" (UID: "d00ae6de-8c56-447b-806c-bf1dfe25cfa8"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:29:16 crc kubenswrapper[4765]: I0319 11:29:16.607180 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 11:29:16 crc kubenswrapper[4765]: I0319 11:29:16.613226 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-kube-api-access-vq7cz" (OuterVolumeSpecName: "kube-api-access-vq7cz") pod "d00ae6de-8c56-447b-806c-bf1dfe25cfa8" (UID: "d00ae6de-8c56-447b-806c-bf1dfe25cfa8"). InnerVolumeSpecName "kube-api-access-vq7cz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:29:16 crc kubenswrapper[4765]: I0319 11:29:16.686309 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d00ae6de-8c56-447b-806c-bf1dfe25cfa8" (UID: "d00ae6de-8c56-447b-806c-bf1dfe25cfa8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:29:16 crc kubenswrapper[4765]: I0319 11:29:16.710395 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vq7cz\" (UniqueName: \"kubernetes.io/projected/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-kube-api-access-vq7cz\") on node \"crc\" DevicePath \"\"" Mar 19 11:29:16 crc kubenswrapper[4765]: I0319 11:29:16.710434 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d00ae6de-8c56-447b-806c-bf1dfe25cfa8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 11:29:17 crc kubenswrapper[4765]: I0319 11:29:17.060410 4765 generic.go:334] "Generic (PLEG): container finished" podID="d00ae6de-8c56-447b-806c-bf1dfe25cfa8" containerID="ee143574aef0448cbe68f7de0e04ddd3ae01c84cdf2dee8ee6b04a316e8ba760" exitCode=0 Mar 19 11:29:17 crc kubenswrapper[4765]: I0319 11:29:17.060574 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps4g2" event={"ID":"d00ae6de-8c56-447b-806c-bf1dfe25cfa8","Type":"ContainerDied","Data":"ee143574aef0448cbe68f7de0e04ddd3ae01c84cdf2dee8ee6b04a316e8ba760"} Mar 19 11:29:17 crc kubenswrapper[4765]: I0319 11:29:17.061699 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-ps4g2" event={"ID":"d00ae6de-8c56-447b-806c-bf1dfe25cfa8","Type":"ContainerDied","Data":"cef586ce4be097aa1cc036165d8c67100e88cb7ec6adcf870df106ead930ba1a"} Mar 19 11:29:17 crc kubenswrapper[4765]: I0319 11:29:17.061800 4765 scope.go:117] "RemoveContainer" containerID="ee143574aef0448cbe68f7de0e04ddd3ae01c84cdf2dee8ee6b04a316e8ba760" Mar 19 11:29:17 crc kubenswrapper[4765]: I0319 11:29:17.060721 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-ps4g2" Mar 19 11:29:17 crc kubenswrapper[4765]: I0319 11:29:17.089675 4765 scope.go:117] "RemoveContainer" containerID="33afa67bf5eaa6d6a94bf6c247a14cad0c8983aabcb82c6725159b314d0e833d" Mar 19 11:29:17 crc kubenswrapper[4765]: I0319 11:29:17.100905 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-ps4g2"] Mar 19 11:29:17 crc kubenswrapper[4765]: I0319 11:29:17.114453 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-ps4g2"] Mar 19 11:29:17 crc kubenswrapper[4765]: I0319 11:29:17.128147 4765 scope.go:117] "RemoveContainer" containerID="9cc03cbb86e1c0d380670fc7d6db5fff8cb63a911ca77d89a237a14f367639a1" Mar 19 11:29:17 crc kubenswrapper[4765]: I0319 11:29:17.163741 4765 scope.go:117] "RemoveContainer" containerID="ee143574aef0448cbe68f7de0e04ddd3ae01c84cdf2dee8ee6b04a316e8ba760" Mar 19 11:29:17 crc kubenswrapper[4765]: E0319 11:29:17.164576 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee143574aef0448cbe68f7de0e04ddd3ae01c84cdf2dee8ee6b04a316e8ba760\": container with ID starting with ee143574aef0448cbe68f7de0e04ddd3ae01c84cdf2dee8ee6b04a316e8ba760 not found: ID does not exist" containerID="ee143574aef0448cbe68f7de0e04ddd3ae01c84cdf2dee8ee6b04a316e8ba760" Mar 19 11:29:17 crc kubenswrapper[4765]: I0319 11:29:17.164627 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee143574aef0448cbe68f7de0e04ddd3ae01c84cdf2dee8ee6b04a316e8ba760"} err="failed to get container status \"ee143574aef0448cbe68f7de0e04ddd3ae01c84cdf2dee8ee6b04a316e8ba760\": rpc error: code = NotFound desc = could not find container \"ee143574aef0448cbe68f7de0e04ddd3ae01c84cdf2dee8ee6b04a316e8ba760\": container with ID starting with ee143574aef0448cbe68f7de0e04ddd3ae01c84cdf2dee8ee6b04a316e8ba760 not 
found: ID does not exist" Mar 19 11:29:17 crc kubenswrapper[4765]: I0319 11:29:17.164664 4765 scope.go:117] "RemoveContainer" containerID="33afa67bf5eaa6d6a94bf6c247a14cad0c8983aabcb82c6725159b314d0e833d" Mar 19 11:29:17 crc kubenswrapper[4765]: E0319 11:29:17.165182 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"33afa67bf5eaa6d6a94bf6c247a14cad0c8983aabcb82c6725159b314d0e833d\": container with ID starting with 33afa67bf5eaa6d6a94bf6c247a14cad0c8983aabcb82c6725159b314d0e833d not found: ID does not exist" containerID="33afa67bf5eaa6d6a94bf6c247a14cad0c8983aabcb82c6725159b314d0e833d" Mar 19 11:29:17 crc kubenswrapper[4765]: I0319 11:29:17.165229 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"33afa67bf5eaa6d6a94bf6c247a14cad0c8983aabcb82c6725159b314d0e833d"} err="failed to get container status \"33afa67bf5eaa6d6a94bf6c247a14cad0c8983aabcb82c6725159b314d0e833d\": rpc error: code = NotFound desc = could not find container \"33afa67bf5eaa6d6a94bf6c247a14cad0c8983aabcb82c6725159b314d0e833d\": container with ID starting with 33afa67bf5eaa6d6a94bf6c247a14cad0c8983aabcb82c6725159b314d0e833d not found: ID does not exist" Mar 19 11:29:17 crc kubenswrapper[4765]: I0319 11:29:17.165252 4765 scope.go:117] "RemoveContainer" containerID="9cc03cbb86e1c0d380670fc7d6db5fff8cb63a911ca77d89a237a14f367639a1" Mar 19 11:29:17 crc kubenswrapper[4765]: E0319 11:29:17.165533 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9cc03cbb86e1c0d380670fc7d6db5fff8cb63a911ca77d89a237a14f367639a1\": container with ID starting with 9cc03cbb86e1c0d380670fc7d6db5fff8cb63a911ca77d89a237a14f367639a1 not found: ID does not exist" containerID="9cc03cbb86e1c0d380670fc7d6db5fff8cb63a911ca77d89a237a14f367639a1" Mar 19 11:29:17 crc kubenswrapper[4765]: I0319 11:29:17.165559 4765 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9cc03cbb86e1c0d380670fc7d6db5fff8cb63a911ca77d89a237a14f367639a1"} err="failed to get container status \"9cc03cbb86e1c0d380670fc7d6db5fff8cb63a911ca77d89a237a14f367639a1\": rpc error: code = NotFound desc = could not find container \"9cc03cbb86e1c0d380670fc7d6db5fff8cb63a911ca77d89a237a14f367639a1\": container with ID starting with 9cc03cbb86e1c0d380670fc7d6db5fff8cb63a911ca77d89a237a14f367639a1 not found: ID does not exist" Mar 19 11:29:18 crc kubenswrapper[4765]: I0319 11:29:18.368272 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d00ae6de-8c56-447b-806c-bf1dfe25cfa8" path="/var/lib/kubelet/pods/d00ae6de-8c56-447b-806c-bf1dfe25cfa8/volumes" Mar 19 11:29:19 crc kubenswrapper[4765]: I0319 11:29:19.079599 4765 generic.go:334] "Generic (PLEG): container finished" podID="0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6" containerID="a3a9d03d29f799d7ecc45339250f16ed851184c46eebbcd8f2f936214ffcd765" exitCode=0 Mar 19 11:29:19 crc kubenswrapper[4765]: I0319 11:29:19.079646 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fkmjb/crc-debug-b8bpr" event={"ID":"0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6","Type":"ContainerDied","Data":"a3a9d03d29f799d7ecc45339250f16ed851184c46eebbcd8f2f936214ffcd765"} Mar 19 11:29:20 crc kubenswrapper[4765]: I0319 11:29:20.186514 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fkmjb/crc-debug-b8bpr" Mar 19 11:29:20 crc kubenswrapper[4765]: I0319 11:29:20.232899 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fkmjb/crc-debug-b8bpr"] Mar 19 11:29:20 crc kubenswrapper[4765]: I0319 11:29:20.242545 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fkmjb/crc-debug-b8bpr"] Mar 19 11:29:20 crc kubenswrapper[4765]: I0319 11:29:20.278172 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6-host\") pod \"0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6\" (UID: \"0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6\") " Mar 19 11:29:20 crc kubenswrapper[4765]: I0319 11:29:20.278268 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czsrw\" (UniqueName: \"kubernetes.io/projected/0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6-kube-api-access-czsrw\") pod \"0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6\" (UID: \"0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6\") " Mar 19 11:29:20 crc kubenswrapper[4765]: I0319 11:29:20.278414 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6-host" (OuterVolumeSpecName: "host") pod "0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6" (UID: "0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:29:20 crc kubenswrapper[4765]: I0319 11:29:20.278796 4765 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6-host\") on node \"crc\" DevicePath \"\"" Mar 19 11:29:20 crc kubenswrapper[4765]: I0319 11:29:20.285740 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6-kube-api-access-czsrw" (OuterVolumeSpecName: "kube-api-access-czsrw") pod "0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6" (UID: "0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6"). InnerVolumeSpecName "kube-api-access-czsrw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:29:20 crc kubenswrapper[4765]: I0319 11:29:20.368795 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6" path="/var/lib/kubelet/pods/0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6/volumes" Mar 19 11:29:20 crc kubenswrapper[4765]: I0319 11:29:20.380178 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czsrw\" (UniqueName: \"kubernetes.io/projected/0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6-kube-api-access-czsrw\") on node \"crc\" DevicePath \"\"" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.100544 4765 scope.go:117] "RemoveContainer" containerID="a3a9d03d29f799d7ecc45339250f16ed851184c46eebbcd8f2f936214ffcd765" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.100644 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fkmjb/crc-debug-b8bpr" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.607558 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fkmjb/crc-debug-xxj7h"] Mar 19 11:29:21 crc kubenswrapper[4765]: E0319 11:29:21.608354 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00ae6de-8c56-447b-806c-bf1dfe25cfa8" containerName="extract-utilities" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.608372 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00ae6de-8c56-447b-806c-bf1dfe25cfa8" containerName="extract-utilities" Mar 19 11:29:21 crc kubenswrapper[4765]: E0319 11:29:21.608388 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6" containerName="container-00" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.608394 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6" containerName="container-00" Mar 19 11:29:21 crc kubenswrapper[4765]: E0319 11:29:21.608425 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00ae6de-8c56-447b-806c-bf1dfe25cfa8" containerName="registry-server" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.608431 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00ae6de-8c56-447b-806c-bf1dfe25cfa8" containerName="registry-server" Mar 19 11:29:21 crc kubenswrapper[4765]: E0319 11:29:21.608438 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d00ae6de-8c56-447b-806c-bf1dfe25cfa8" containerName="extract-content" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.608445 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d00ae6de-8c56-447b-806c-bf1dfe25cfa8" containerName="extract-content" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.610568 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d00ae6de-8c56-447b-806c-bf1dfe25cfa8" 
containerName="registry-server" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.610614 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="0d7efd74-f453-42eb-b9ca-6d8a47d8e1a6" containerName="container-00" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.611413 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fkmjb/crc-debug-xxj7h" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.702451 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gxss4\" (UniqueName: \"kubernetes.io/projected/3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83-kube-api-access-gxss4\") pod \"crc-debug-xxj7h\" (UID: \"3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83\") " pod="openshift-must-gather-fkmjb/crc-debug-xxj7h" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.702618 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83-host\") pod \"crc-debug-xxj7h\" (UID: \"3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83\") " pod="openshift-must-gather-fkmjb/crc-debug-xxj7h" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.804845 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gxss4\" (UniqueName: \"kubernetes.io/projected/3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83-kube-api-access-gxss4\") pod \"crc-debug-xxj7h\" (UID: \"3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83\") " pod="openshift-must-gather-fkmjb/crc-debug-xxj7h" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.805054 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83-host\") pod \"crc-debug-xxj7h\" (UID: \"3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83\") " pod="openshift-must-gather-fkmjb/crc-debug-xxj7h" Mar 19 11:29:21 crc 
kubenswrapper[4765]: I0319 11:29:21.805234 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83-host\") pod \"crc-debug-xxj7h\" (UID: \"3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83\") " pod="openshift-must-gather-fkmjb/crc-debug-xxj7h" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.830677 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gxss4\" (UniqueName: \"kubernetes.io/projected/3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83-kube-api-access-gxss4\") pod \"crc-debug-xxj7h\" (UID: \"3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83\") " pod="openshift-must-gather-fkmjb/crc-debug-xxj7h" Mar 19 11:29:21 crc kubenswrapper[4765]: I0319 11:29:21.929412 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fkmjb/crc-debug-xxj7h" Mar 19 11:29:21 crc kubenswrapper[4765]: W0319 11:29:21.976667 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3e2ab4bd_7fd9_4609_b39b_ca6baceb5d83.slice/crio-29cbc7f79bbc41e35d026ec367d1e7411766c4cc8718a459d3bc9e44df56fc76 WatchSource:0}: Error finding container 29cbc7f79bbc41e35d026ec367d1e7411766c4cc8718a459d3bc9e44df56fc76: Status 404 returned error can't find the container with id 29cbc7f79bbc41e35d026ec367d1e7411766c4cc8718a459d3bc9e44df56fc76 Mar 19 11:29:22 crc kubenswrapper[4765]: I0319 11:29:22.164152 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fkmjb/crc-debug-xxj7h" event={"ID":"3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83","Type":"ContainerStarted","Data":"29cbc7f79bbc41e35d026ec367d1e7411766c4cc8718a459d3bc9e44df56fc76"} Mar 19 11:29:23 crc kubenswrapper[4765]: I0319 11:29:23.180264 4765 generic.go:334] "Generic (PLEG): container finished" podID="3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83" 
containerID="c1b77427b0638d8ca5d6766b8430c46eb28e02569d007d4577e3170aedc303ba" exitCode=0 Mar 19 11:29:23 crc kubenswrapper[4765]: I0319 11:29:23.180645 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fkmjb/crc-debug-xxj7h" event={"ID":"3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83","Type":"ContainerDied","Data":"c1b77427b0638d8ca5d6766b8430c46eb28e02569d007d4577e3170aedc303ba"} Mar 19 11:29:23 crc kubenswrapper[4765]: I0319 11:29:23.588997 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fkmjb/crc-debug-xxj7h"] Mar 19 11:29:23 crc kubenswrapper[4765]: I0319 11:29:23.599272 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fkmjb/crc-debug-xxj7h"] Mar 19 11:29:23 crc kubenswrapper[4765]: I0319 11:29:23.617251 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:23 crc kubenswrapper[4765]: I0319 11:29:23.661685 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:23 crc kubenswrapper[4765]: I0319 11:29:23.856377 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bxhsm"] Mar 19 11:29:24 crc kubenswrapper[4765]: I0319 11:29:24.281199 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fkmjb/crc-debug-xxj7h" Mar 19 11:29:24 crc kubenswrapper[4765]: I0319 11:29:24.351827 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83-host\") pod \"3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83\" (UID: \"3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83\") " Mar 19 11:29:24 crc kubenswrapper[4765]: I0319 11:29:24.352109 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gxss4\" (UniqueName: \"kubernetes.io/projected/3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83-kube-api-access-gxss4\") pod \"3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83\" (UID: \"3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83\") " Mar 19 11:29:24 crc kubenswrapper[4765]: I0319 11:29:24.353235 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83-host" (OuterVolumeSpecName: "host") pod "3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83" (UID: "3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83"). InnerVolumeSpecName "host". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:29:24 crc kubenswrapper[4765]: I0319 11:29:24.362286 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83-kube-api-access-gxss4" (OuterVolumeSpecName: "kube-api-access-gxss4") pod "3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83" (UID: "3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83"). InnerVolumeSpecName "kube-api-access-gxss4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:29:24 crc kubenswrapper[4765]: I0319 11:29:24.367703 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83" path="/var/lib/kubelet/pods/3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83/volumes" Mar 19 11:29:24 crc kubenswrapper[4765]: I0319 11:29:24.454544 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gxss4\" (UniqueName: \"kubernetes.io/projected/3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83-kube-api-access-gxss4\") on node \"crc\" DevicePath \"\"" Mar 19 11:29:24 crc kubenswrapper[4765]: I0319 11:29:24.454839 4765 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83-host\") on node \"crc\" DevicePath \"\"" Mar 19 11:29:24 crc kubenswrapper[4765]: I0319 11:29:24.787634 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-fkmjb/crc-debug-rxvqj"] Mar 19 11:29:24 crc kubenswrapper[4765]: E0319 11:29:24.788110 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83" containerName="container-00" Mar 19 11:29:24 crc kubenswrapper[4765]: I0319 11:29:24.788130 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83" containerName="container-00" Mar 19 11:29:24 crc kubenswrapper[4765]: I0319 11:29:24.788295 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="3e2ab4bd-7fd9-4609-b39b-ca6baceb5d83" containerName="container-00" Mar 19 11:29:24 crc kubenswrapper[4765]: I0319 11:29:24.788895 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fkmjb/crc-debug-rxvqj" Mar 19 11:29:24 crc kubenswrapper[4765]: I0319 11:29:24.964230 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adedb4d2-1fa8-4a06-ab3f-1ee238793e11-host\") pod \"crc-debug-rxvqj\" (UID: \"adedb4d2-1fa8-4a06-ab3f-1ee238793e11\") " pod="openshift-must-gather-fkmjb/crc-debug-rxvqj" Mar 19 11:29:24 crc kubenswrapper[4765]: I0319 11:29:24.964334 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kdk2\" (UniqueName: \"kubernetes.io/projected/adedb4d2-1fa8-4a06-ab3f-1ee238793e11-kube-api-access-9kdk2\") pod \"crc-debug-rxvqj\" (UID: \"adedb4d2-1fa8-4a06-ab3f-1ee238793e11\") " pod="openshift-must-gather-fkmjb/crc-debug-rxvqj" Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.066821 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9kdk2\" (UniqueName: \"kubernetes.io/projected/adedb4d2-1fa8-4a06-ab3f-1ee238793e11-kube-api-access-9kdk2\") pod \"crc-debug-rxvqj\" (UID: \"adedb4d2-1fa8-4a06-ab3f-1ee238793e11\") " pod="openshift-must-gather-fkmjb/crc-debug-rxvqj" Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.067047 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adedb4d2-1fa8-4a06-ab3f-1ee238793e11-host\") pod \"crc-debug-rxvqj\" (UID: \"adedb4d2-1fa8-4a06-ab3f-1ee238793e11\") " pod="openshift-must-gather-fkmjb/crc-debug-rxvqj" Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.067217 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adedb4d2-1fa8-4a06-ab3f-1ee238793e11-host\") pod \"crc-debug-rxvqj\" (UID: \"adedb4d2-1fa8-4a06-ab3f-1ee238793e11\") " pod="openshift-must-gather-fkmjb/crc-debug-rxvqj" Mar 19 11:29:25 crc 
kubenswrapper[4765]: I0319 11:29:25.082831 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kdk2\" (UniqueName: \"kubernetes.io/projected/adedb4d2-1fa8-4a06-ab3f-1ee238793e11-kube-api-access-9kdk2\") pod \"crc-debug-rxvqj\" (UID: \"adedb4d2-1fa8-4a06-ab3f-1ee238793e11\") " pod="openshift-must-gather-fkmjb/crc-debug-rxvqj" Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.106216 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fkmjb/crc-debug-rxvqj" Mar 19 11:29:25 crc kubenswrapper[4765]: W0319 11:29:25.138438 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podadedb4d2_1fa8_4a06_ab3f_1ee238793e11.slice/crio-313a8ea484d067ccabc7c7da21b1aff4b15de6b3cfc8e4a57fd83f4ac4fd2ff0 WatchSource:0}: Error finding container 313a8ea484d067ccabc7c7da21b1aff4b15de6b3cfc8e4a57fd83f4ac4fd2ff0: Status 404 returned error can't find the container with id 313a8ea484d067ccabc7c7da21b1aff4b15de6b3cfc8e4a57fd83f4ac4fd2ff0 Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.200321 4765 scope.go:117] "RemoveContainer" containerID="c1b77427b0638d8ca5d6766b8430c46eb28e02569d007d4577e3170aedc303ba" Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.201028 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fkmjb/crc-debug-xxj7h" Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.204071 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fkmjb/crc-debug-rxvqj" event={"ID":"adedb4d2-1fa8-4a06-ab3f-1ee238793e11","Type":"ContainerStarted","Data":"313a8ea484d067ccabc7c7da21b1aff4b15de6b3cfc8e4a57fd83f4ac4fd2ff0"} Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.204166 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bxhsm" podUID="c54cfe34-1d42-488d-b462-cdb622dfcfc8" containerName="registry-server" containerID="cri-o://2c694efa9e0ce2ca169dc8821ed694c60f97333ec14b1ac60dc034f10062b0da" gracePeriod=2 Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.752515 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.780268 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c54cfe34-1d42-488d-b462-cdb622dfcfc8-utilities\") pod \"c54cfe34-1d42-488d-b462-cdb622dfcfc8\" (UID: \"c54cfe34-1d42-488d-b462-cdb622dfcfc8\") " Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.780333 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d78ck\" (UniqueName: \"kubernetes.io/projected/c54cfe34-1d42-488d-b462-cdb622dfcfc8-kube-api-access-d78ck\") pod \"c54cfe34-1d42-488d-b462-cdb622dfcfc8\" (UID: \"c54cfe34-1d42-488d-b462-cdb622dfcfc8\") " Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.780416 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c54cfe34-1d42-488d-b462-cdb622dfcfc8-catalog-content\") pod \"c54cfe34-1d42-488d-b462-cdb622dfcfc8\" (UID: 
\"c54cfe34-1d42-488d-b462-cdb622dfcfc8\") " Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.784711 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c54cfe34-1d42-488d-b462-cdb622dfcfc8-utilities" (OuterVolumeSpecName: "utilities") pod "c54cfe34-1d42-488d-b462-cdb622dfcfc8" (UID: "c54cfe34-1d42-488d-b462-cdb622dfcfc8"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.796231 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c54cfe34-1d42-488d-b462-cdb622dfcfc8-kube-api-access-d78ck" (OuterVolumeSpecName: "kube-api-access-d78ck") pod "c54cfe34-1d42-488d-b462-cdb622dfcfc8" (UID: "c54cfe34-1d42-488d-b462-cdb622dfcfc8"). InnerVolumeSpecName "kube-api-access-d78ck". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.882324 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c54cfe34-1d42-488d-b462-cdb622dfcfc8-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.882360 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d78ck\" (UniqueName: \"kubernetes.io/projected/c54cfe34-1d42-488d-b462-cdb622dfcfc8-kube-api-access-d78ck\") on node \"crc\" DevicePath \"\"" Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.938262 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c54cfe34-1d42-488d-b462-cdb622dfcfc8-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c54cfe34-1d42-488d-b462-cdb622dfcfc8" (UID: "c54cfe34-1d42-488d-b462-cdb622dfcfc8"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:29:25 crc kubenswrapper[4765]: I0319 11:29:25.983739 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c54cfe34-1d42-488d-b462-cdb622dfcfc8-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.213452 4765 generic.go:334] "Generic (PLEG): container finished" podID="adedb4d2-1fa8-4a06-ab3f-1ee238793e11" containerID="c93dae9f77fafddeb887cfbe9b12f286ce4c62b1b2a5e39fdc0acc0ff771984f" exitCode=0 Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.213532 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fkmjb/crc-debug-rxvqj" event={"ID":"adedb4d2-1fa8-4a06-ab3f-1ee238793e11","Type":"ContainerDied","Data":"c93dae9f77fafddeb887cfbe9b12f286ce4c62b1b2a5e39fdc0acc0ff771984f"} Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.221086 4765 generic.go:334] "Generic (PLEG): container finished" podID="c54cfe34-1d42-488d-b462-cdb622dfcfc8" containerID="2c694efa9e0ce2ca169dc8821ed694c60f97333ec14b1ac60dc034f10062b0da" exitCode=0 Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.221148 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bxhsm" event={"ID":"c54cfe34-1d42-488d-b462-cdb622dfcfc8","Type":"ContainerDied","Data":"2c694efa9e0ce2ca169dc8821ed694c60f97333ec14b1ac60dc034f10062b0da"} Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.221176 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bxhsm" event={"ID":"c54cfe34-1d42-488d-b462-cdb622dfcfc8","Type":"ContainerDied","Data":"39f5a85f527dc3ca5f9697be01f798b0dafa6e843489f609599b41032e1a80ed"} Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.221199 4765 scope.go:117] "RemoveContainer" containerID="2c694efa9e0ce2ca169dc8821ed694c60f97333ec14b1ac60dc034f10062b0da" Mar 19 11:29:26 crc 
kubenswrapper[4765]: I0319 11:29:26.221268 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bxhsm" Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.259766 4765 scope.go:117] "RemoveContainer" containerID="9ff53f95291e0bf58e3c1aea6ec6f99f203489f0089e502f4ff239ce91bb2886" Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.269461 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bxhsm"] Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.283759 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bxhsm"] Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.286063 4765 scope.go:117] "RemoveContainer" containerID="355ece33635cdffc5a854ef435c615bbbd6a42a0c0ebbd66f3cc31669344d521" Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.297052 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fkmjb/crc-debug-rxvqj"] Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.304625 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fkmjb/crc-debug-rxvqj"] Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.337444 4765 scope.go:117] "RemoveContainer" containerID="2c694efa9e0ce2ca169dc8821ed694c60f97333ec14b1ac60dc034f10062b0da" Mar 19 11:29:26 crc kubenswrapper[4765]: E0319 11:29:26.337979 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c694efa9e0ce2ca169dc8821ed694c60f97333ec14b1ac60dc034f10062b0da\": container with ID starting with 2c694efa9e0ce2ca169dc8821ed694c60f97333ec14b1ac60dc034f10062b0da not found: ID does not exist" containerID="2c694efa9e0ce2ca169dc8821ed694c60f97333ec14b1ac60dc034f10062b0da" Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.338038 4765 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"2c694efa9e0ce2ca169dc8821ed694c60f97333ec14b1ac60dc034f10062b0da"} err="failed to get container status \"2c694efa9e0ce2ca169dc8821ed694c60f97333ec14b1ac60dc034f10062b0da\": rpc error: code = NotFound desc = could not find container \"2c694efa9e0ce2ca169dc8821ed694c60f97333ec14b1ac60dc034f10062b0da\": container with ID starting with 2c694efa9e0ce2ca169dc8821ed694c60f97333ec14b1ac60dc034f10062b0da not found: ID does not exist" Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.338071 4765 scope.go:117] "RemoveContainer" containerID="9ff53f95291e0bf58e3c1aea6ec6f99f203489f0089e502f4ff239ce91bb2886" Mar 19 11:29:26 crc kubenswrapper[4765]: E0319 11:29:26.339515 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ff53f95291e0bf58e3c1aea6ec6f99f203489f0089e502f4ff239ce91bb2886\": container with ID starting with 9ff53f95291e0bf58e3c1aea6ec6f99f203489f0089e502f4ff239ce91bb2886 not found: ID does not exist" containerID="9ff53f95291e0bf58e3c1aea6ec6f99f203489f0089e502f4ff239ce91bb2886" Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.339543 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ff53f95291e0bf58e3c1aea6ec6f99f203489f0089e502f4ff239ce91bb2886"} err="failed to get container status \"9ff53f95291e0bf58e3c1aea6ec6f99f203489f0089e502f4ff239ce91bb2886\": rpc error: code = NotFound desc = could not find container \"9ff53f95291e0bf58e3c1aea6ec6f99f203489f0089e502f4ff239ce91bb2886\": container with ID starting with 9ff53f95291e0bf58e3c1aea6ec6f99f203489f0089e502f4ff239ce91bb2886 not found: ID does not exist" Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.339559 4765 scope.go:117] "RemoveContainer" containerID="355ece33635cdffc5a854ef435c615bbbd6a42a0c0ebbd66f3cc31669344d521" Mar 19 11:29:26 crc kubenswrapper[4765]: E0319 11:29:26.339849 4765 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"355ece33635cdffc5a854ef435c615bbbd6a42a0c0ebbd66f3cc31669344d521\": container with ID starting with 355ece33635cdffc5a854ef435c615bbbd6a42a0c0ebbd66f3cc31669344d521 not found: ID does not exist" containerID="355ece33635cdffc5a854ef435c615bbbd6a42a0c0ebbd66f3cc31669344d521" Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.339873 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"355ece33635cdffc5a854ef435c615bbbd6a42a0c0ebbd66f3cc31669344d521"} err="failed to get container status \"355ece33635cdffc5a854ef435c615bbbd6a42a0c0ebbd66f3cc31669344d521\": rpc error: code = NotFound desc = could not find container \"355ece33635cdffc5a854ef435c615bbbd6a42a0c0ebbd66f3cc31669344d521\": container with ID starting with 355ece33635cdffc5a854ef435c615bbbd6a42a0c0ebbd66f3cc31669344d521 not found: ID does not exist" Mar 19 11:29:26 crc kubenswrapper[4765]: I0319 11:29:26.375564 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c54cfe34-1d42-488d-b462-cdb622dfcfc8" path="/var/lib/kubelet/pods/c54cfe34-1d42-488d-b462-cdb622dfcfc8/volumes" Mar 19 11:29:27 crc kubenswrapper[4765]: I0319 11:29:27.352008 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fkmjb/crc-debug-rxvqj" Mar 19 11:29:27 crc kubenswrapper[4765]: I0319 11:29:27.357356 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:29:27 crc kubenswrapper[4765]: E0319 11:29:27.357687 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:29:27 crc kubenswrapper[4765]: I0319 11:29:27.417381 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adedb4d2-1fa8-4a06-ab3f-1ee238793e11-host\") pod \"adedb4d2-1fa8-4a06-ab3f-1ee238793e11\" (UID: \"adedb4d2-1fa8-4a06-ab3f-1ee238793e11\") " Mar 19 11:29:27 crc kubenswrapper[4765]: I0319 11:29:27.417589 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9kdk2\" (UniqueName: \"kubernetes.io/projected/adedb4d2-1fa8-4a06-ab3f-1ee238793e11-kube-api-access-9kdk2\") pod \"adedb4d2-1fa8-4a06-ab3f-1ee238793e11\" (UID: \"adedb4d2-1fa8-4a06-ab3f-1ee238793e11\") " Mar 19 11:29:27 crc kubenswrapper[4765]: I0319 11:29:27.418626 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/adedb4d2-1fa8-4a06-ab3f-1ee238793e11-host" (OuterVolumeSpecName: "host") pod "adedb4d2-1fa8-4a06-ab3f-1ee238793e11" (UID: "adedb4d2-1fa8-4a06-ab3f-1ee238793e11"). InnerVolumeSpecName "host". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 19 11:29:27 crc kubenswrapper[4765]: I0319 11:29:27.440203 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/adedb4d2-1fa8-4a06-ab3f-1ee238793e11-kube-api-access-9kdk2" (OuterVolumeSpecName: "kube-api-access-9kdk2") pod "adedb4d2-1fa8-4a06-ab3f-1ee238793e11" (UID: "adedb4d2-1fa8-4a06-ab3f-1ee238793e11"). InnerVolumeSpecName "kube-api-access-9kdk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:29:27 crc kubenswrapper[4765]: I0319 11:29:27.520543 4765 reconciler_common.go:293] "Volume detached for volume \"host\" (UniqueName: \"kubernetes.io/host-path/adedb4d2-1fa8-4a06-ab3f-1ee238793e11-host\") on node \"crc\" DevicePath \"\"" Mar 19 11:29:27 crc kubenswrapper[4765]: I0319 11:29:27.520589 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9kdk2\" (UniqueName: \"kubernetes.io/projected/adedb4d2-1fa8-4a06-ab3f-1ee238793e11-kube-api-access-9kdk2\") on node \"crc\" DevicePath \"\"" Mar 19 11:29:28 crc kubenswrapper[4765]: I0319 11:29:28.249842 4765 scope.go:117] "RemoveContainer" containerID="c93dae9f77fafddeb887cfbe9b12f286ce4c62b1b2a5e39fdc0acc0ff771984f" Mar 19 11:29:28 crc kubenswrapper[4765]: I0319 11:29:28.249923 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fkmjb/crc-debug-rxvqj" Mar 19 11:29:28 crc kubenswrapper[4765]: I0319 11:29:28.368532 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="adedb4d2-1fa8-4a06-ab3f-1ee238793e11" path="/var/lib/kubelet/pods/adedb4d2-1fa8-4a06-ab3f-1ee238793e11/volumes" Mar 19 11:29:41 crc kubenswrapper[4765]: I0319 11:29:41.356892 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:29:41 crc kubenswrapper[4765]: E0319 11:29:41.360147 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:29:56 crc kubenswrapper[4765]: I0319 11:29:56.356792 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:29:56 crc kubenswrapper[4765]: E0319 11:29:56.358126 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:29:57 crc kubenswrapper[4765]: I0319 11:29:57.414246 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openstack/swift-proxy-556979b4dc-zj26d" podUID="00e0de39-87cf-4a6e-8980-a294f329e430" containerName="proxy-server" probeResult="failure" output="HTTP probe failed with statuscode: 502" Mar 19 11:29:59 crc 
kubenswrapper[4765]: I0319 11:29:59.442380 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-77865d778-4kfkp_446b5005-1960-413b-8ab2-f0da071ab4ba/barbican-api/0.log" Mar 19 11:29:59 crc kubenswrapper[4765]: I0319 11:29:59.597347 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-api-77865d778-4kfkp_446b5005-1960-413b-8ab2-f0da071ab4ba/barbican-api-log/0.log" Mar 19 11:29:59 crc kubenswrapper[4765]: I0319 11:29:59.621465 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6955bd84cd-t7qkv_65d1a29f-39b3-40d7-9db2-246fc05348cc/barbican-keystone-listener/0.log" Mar 19 11:29:59 crc kubenswrapper[4765]: I0319 11:29:59.690124 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-keystone-listener-6955bd84cd-t7qkv_65d1a29f-39b3-40d7-9db2-246fc05348cc/barbican-keystone-listener-log/0.log" Mar 19 11:29:59 crc kubenswrapper[4765]: I0319 11:29:59.842628 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7f975545dc-7gv92_b18f0688-cbc1-49ae-a721-b964e45cc1ea/barbican-worker/0.log" Mar 19 11:29:59 crc kubenswrapper[4765]: I0319 11:29:59.887718 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_barbican-worker-7f975545dc-7gv92_b18f0688-cbc1-49ae-a721-b964e45cc1ea/barbican-worker-log/0.log" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.132574 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5ff62acd-7f88-46c9-bd52-150092370b2d/ceilometer-central-agent/0.log" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.143022 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_bootstrap-edpm-deployment-openstack-edpm-ipam-8lsfm_0050f5ac-5380-49b0-98ad-fdd7c3b94f51/bootstrap-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.154475 4765 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["openshift-infra/auto-csr-approver-29565330-2kgwr"] Mar 19 11:30:00 crc kubenswrapper[4765]: E0319 11:30:00.155038 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="adedb4d2-1fa8-4a06-ab3f-1ee238793e11" containerName="container-00" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.155063 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="adedb4d2-1fa8-4a06-ab3f-1ee238793e11" containerName="container-00" Mar 19 11:30:00 crc kubenswrapper[4765]: E0319 11:30:00.155086 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54cfe34-1d42-488d-b462-cdb622dfcfc8" containerName="registry-server" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.155094 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54cfe34-1d42-488d-b462-cdb622dfcfc8" containerName="registry-server" Mar 19 11:30:00 crc kubenswrapper[4765]: E0319 11:30:00.155108 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54cfe34-1d42-488d-b462-cdb622dfcfc8" containerName="extract-content" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.155116 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54cfe34-1d42-488d-b462-cdb622dfcfc8" containerName="extract-content" Mar 19 11:30:00 crc kubenswrapper[4765]: E0319 11:30:00.155141 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c54cfe34-1d42-488d-b462-cdb622dfcfc8" containerName="extract-utilities" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.155152 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="c54cfe34-1d42-488d-b462-cdb622dfcfc8" containerName="extract-utilities" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.155360 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="adedb4d2-1fa8-4a06-ab3f-1ee238793e11" containerName="container-00" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.155390 4765 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="c54cfe34-1d42-488d-b462-cdb622dfcfc8" containerName="registry-server" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.156168 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565330-2kgwr" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.165076 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.165356 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.165505 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.167511 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq"] Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.168917 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.171199 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5ff62acd-7f88-46c9-bd52-150092370b2d/ceilometer-notification-agent/0.log" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.172756 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.173069 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.180465 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565330-2kgwr"] Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.206896 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36bffb14-466b-4835-a23e-8217512fa207-secret-volume\") pod \"collect-profiles-29565330-qwtsq\" (UID: \"36bffb14-466b-4835-a23e-8217512fa207\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.207834 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69s6d\" (UniqueName: \"kubernetes.io/projected/32da5e9d-72fb-43b8-903f-67dcbb0187bf-kube-api-access-69s6d\") pod \"auto-csr-approver-29565330-2kgwr\" (UID: \"32da5e9d-72fb-43b8-903f-67dcbb0187bf\") " pod="openshift-infra/auto-csr-approver-29565330-2kgwr" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.207894 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/36bffb14-466b-4835-a23e-8217512fa207-config-volume\") pod \"collect-profiles-29565330-qwtsq\" (UID: \"36bffb14-466b-4835-a23e-8217512fa207\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.208013 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qlbb7\" (UniqueName: \"kubernetes.io/projected/36bffb14-466b-4835-a23e-8217512fa207-kube-api-access-qlbb7\") pod \"collect-profiles-29565330-qwtsq\" (UID: \"36bffb14-466b-4835-a23e-8217512fa207\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.208289 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq"] Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.310129 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36bffb14-466b-4835-a23e-8217512fa207-secret-volume\") pod \"collect-profiles-29565330-qwtsq\" (UID: \"36bffb14-466b-4835-a23e-8217512fa207\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.310222 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-69s6d\" (UniqueName: \"kubernetes.io/projected/32da5e9d-72fb-43b8-903f-67dcbb0187bf-kube-api-access-69s6d\") pod \"auto-csr-approver-29565330-2kgwr\" (UID: \"32da5e9d-72fb-43b8-903f-67dcbb0187bf\") " pod="openshift-infra/auto-csr-approver-29565330-2kgwr" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.310257 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36bffb14-466b-4835-a23e-8217512fa207-config-volume\") pod 
\"collect-profiles-29565330-qwtsq\" (UID: \"36bffb14-466b-4835-a23e-8217512fa207\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.310309 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qlbb7\" (UniqueName: \"kubernetes.io/projected/36bffb14-466b-4835-a23e-8217512fa207-kube-api-access-qlbb7\") pod \"collect-profiles-29565330-qwtsq\" (UID: \"36bffb14-466b-4835-a23e-8217512fa207\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.311509 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36bffb14-466b-4835-a23e-8217512fa207-config-volume\") pod \"collect-profiles-29565330-qwtsq\" (UID: \"36bffb14-466b-4835-a23e-8217512fa207\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.329751 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36bffb14-466b-4835-a23e-8217512fa207-secret-volume\") pod \"collect-profiles-29565330-qwtsq\" (UID: \"36bffb14-466b-4835-a23e-8217512fa207\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.337678 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-69s6d\" (UniqueName: \"kubernetes.io/projected/32da5e9d-72fb-43b8-903f-67dcbb0187bf-kube-api-access-69s6d\") pod \"auto-csr-approver-29565330-2kgwr\" (UID: \"32da5e9d-72fb-43b8-903f-67dcbb0187bf\") " pod="openshift-infra/auto-csr-approver-29565330-2kgwr" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.337797 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qlbb7\" 
(UniqueName: \"kubernetes.io/projected/36bffb14-466b-4835-a23e-8217512fa207-kube-api-access-qlbb7\") pod \"collect-profiles-29565330-qwtsq\" (UID: \"36bffb14-466b-4835-a23e-8217512fa207\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.355098 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5ff62acd-7f88-46c9-bd52-150092370b2d/proxy-httpd/0.log" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.370591 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ceilometer-0_5ff62acd-7f88-46c9-bd52-150092370b2d/sg-core/0.log" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.466828 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7/cinder-api/0.log" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.481712 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565330-2kgwr" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.499082 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.630167 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-api-0_fd429c64-b8c5-4ca2-a290-b5c2b0ae71f7/cinder-api-log/0.log" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.761418 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_da64a060-18bb-4b34-9374-1fec5ad88ede/probe/0.log" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.811819 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_cinder-scheduler-0_da64a060-18bb-4b34-9374-1fec5ad88ede/cinder-scheduler/0.log" Mar 19 11:30:00 crc kubenswrapper[4765]: I0319 11:30:00.984999 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565330-2kgwr"] Mar 19 11:30:01 crc kubenswrapper[4765]: I0319 11:30:01.180885 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq"] Mar 19 11:30:01 crc kubenswrapper[4765]: W0319 11:30:01.180889 4765 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod36bffb14_466b_4835_a23e_8217512fa207.slice/crio-fa5fc8fe9fcf17ffdc13575cbb4eab5f0084a2a716b2057a4159077c640053ba WatchSource:0}: Error finding container fa5fc8fe9fcf17ffdc13575cbb4eab5f0084a2a716b2057a4159077c640053ba: Status 404 returned error can't find the container with id fa5fc8fe9fcf17ffdc13575cbb4eab5f0084a2a716b2057a4159077c640053ba Mar 19 11:30:01 crc kubenswrapper[4765]: I0319 11:30:01.289885 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-network-edpm-deployment-openstack-edpm-ipam-8lzpw_fd32b580-78f7-478e-ba1d-9d1a86b75f3a/configure-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:01 crc kubenswrapper[4765]: I0319 11:30:01.340355 
4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_configure-os-edpm-deployment-openstack-edpm-ipam-w4h6g_546dffbe-3a15-4074-a5be-deac4d1530e3/configure-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:01 crc kubenswrapper[4765]: I0319 11:30:01.552664 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-t5pnc_8051ab1f-9c27-4c1b-b9f9-9f883c67bea9/init/0.log" Mar 19 11:30:01 crc kubenswrapper[4765]: I0319 11:30:01.586425 4765 generic.go:334] "Generic (PLEG): container finished" podID="36bffb14-466b-4835-a23e-8217512fa207" containerID="943929a0c7a345271963ce66c4e87aa6697a6a2d034cd31cfef517f2ea279fb2" exitCode=0 Mar 19 11:30:01 crc kubenswrapper[4765]: I0319 11:30:01.586483 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" event={"ID":"36bffb14-466b-4835-a23e-8217512fa207","Type":"ContainerDied","Data":"943929a0c7a345271963ce66c4e87aa6697a6a2d034cd31cfef517f2ea279fb2"} Mar 19 11:30:01 crc kubenswrapper[4765]: I0319 11:30:01.586552 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" event={"ID":"36bffb14-466b-4835-a23e-8217512fa207","Type":"ContainerStarted","Data":"fa5fc8fe9fcf17ffdc13575cbb4eab5f0084a2a716b2057a4159077c640053ba"} Mar 19 11:30:01 crc kubenswrapper[4765]: I0319 11:30:01.587729 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565330-2kgwr" event={"ID":"32da5e9d-72fb-43b8-903f-67dcbb0187bf","Type":"ContainerStarted","Data":"4910348f49fa155fa4005b15a664fa080f6e740f36dad74ac40a499870208cc4"} Mar 19 11:30:01 crc kubenswrapper[4765]: I0319 11:30:01.742077 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-t5pnc_8051ab1f-9c27-4c1b-b9f9-9f883c67bea9/init/0.log" Mar 19 11:30:01 crc kubenswrapper[4765]: I0319 11:30:01.785269 4765 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78c64bc9c5-t5pnc_8051ab1f-9c27-4c1b-b9f9-9f883c67bea9/dnsmasq-dns/0.log" Mar 19 11:30:01 crc kubenswrapper[4765]: I0319 11:30:01.877835 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_download-cache-edpm-deployment-openstack-edpm-ipam-xdnxr_5381bac5-1b71-4489-97fd-c49d0ae1783b/download-cache-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:02 crc kubenswrapper[4765]: I0319 11:30:02.037585 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac/glance-httpd/0.log" Mar 19 11:30:02 crc kubenswrapper[4765]: I0319 11:30:02.105938 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-external-api-0_0dd43b9a-4bf8-4e4a-9d09-d8b1a82d14ac/glance-log/0.log" Mar 19 11:30:02 crc kubenswrapper[4765]: I0319 11:30:02.224262 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02/glance-httpd/0.log" Mar 19 11:30:02 crc kubenswrapper[4765]: I0319 11:30:02.265325 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_glance-default-internal-api-0_e259ceb1-0bd2-4e09-bbf0-a7e28b35bb02/glance-log/0.log" Mar 19 11:30:02 crc kubenswrapper[4765]: I0319 11:30:02.465935 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_horizon-6c6ff5646d-fmdz2_b506e362-44bf-4267-bea0-18131aa011fa/horizon/0.log" Mar 19 11:30:02 crc kubenswrapper[4765]: I0319 11:30:02.811266 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-certs-edpm-deployment-openstack-edpm-ipam-ppstd_ac71ef52-bfb2-44d0-be24-71e8e5e58475/install-certs-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:02 crc kubenswrapper[4765]: I0319 11:30:02.832490 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_horizon-6c6ff5646d-fmdz2_b506e362-44bf-4267-bea0-18131aa011fa/horizon-log/0.log" Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.279912 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.284776 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_install-os-edpm-deployment-openstack-edpm-ipam-4x5vg_c09e1efc-02d2-4e0f-9e16-36d9627e0fb8/install-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.375331 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-79688b6ffc-lc92w_c59a3da3-7154-4531-9bf8-96771979b410/keystone-api/0.log" Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.393745 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qlbb7\" (UniqueName: \"kubernetes.io/projected/36bffb14-466b-4835-a23e-8217512fa207-kube-api-access-qlbb7\") pod \"36bffb14-466b-4835-a23e-8217512fa207\" (UID: \"36bffb14-466b-4835-a23e-8217512fa207\") " Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.397147 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36bffb14-466b-4835-a23e-8217512fa207-config-volume\") pod \"36bffb14-466b-4835-a23e-8217512fa207\" (UID: \"36bffb14-466b-4835-a23e-8217512fa207\") " Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.397214 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36bffb14-466b-4835-a23e-8217512fa207-secret-volume\") pod \"36bffb14-466b-4835-a23e-8217512fa207\" (UID: \"36bffb14-466b-4835-a23e-8217512fa207\") " Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.398099 4765 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/36bffb14-466b-4835-a23e-8217512fa207-config-volume" (OuterVolumeSpecName: "config-volume") pod "36bffb14-466b-4835-a23e-8217512fa207" (UID: "36bffb14-466b-4835-a23e-8217512fa207"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.414463 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/36bffb14-466b-4835-a23e-8217512fa207-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "36bffb14-466b-4835-a23e-8217512fa207" (UID: "36bffb14-466b-4835-a23e-8217512fa207"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.414617 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/36bffb14-466b-4835-a23e-8217512fa207-kube-api-access-qlbb7" (OuterVolumeSpecName: "kube-api-access-qlbb7") pod "36bffb14-466b-4835-a23e-8217512fa207" (UID: "36bffb14-466b-4835-a23e-8217512fa207"). InnerVolumeSpecName "kube-api-access-qlbb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.500530 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qlbb7\" (UniqueName: \"kubernetes.io/projected/36bffb14-466b-4835-a23e-8217512fa207-kube-api-access-qlbb7\") on node \"crc\" DevicePath \"\"" Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.500590 4765 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/36bffb14-466b-4835-a23e-8217512fa207-config-volume\") on node \"crc\" DevicePath \"\"" Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.500604 4765 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/36bffb14-466b-4835-a23e-8217512fa207-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.512164 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_keystone-cron-29565301-lwwfz_760a6c55-471d-478e-b75a-713476259c81/keystone-cron/0.log" Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.570558 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_kube-state-metrics-0_06359a74-a7cd-45ab-bc64-ef3d71373e5a/kube-state-metrics/0.log" Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.610393 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.610387 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29565330-qwtsq" event={"ID":"36bffb14-466b-4835-a23e-8217512fa207","Type":"ContainerDied","Data":"fa5fc8fe9fcf17ffdc13575cbb4eab5f0084a2a716b2057a4159077c640053ba"} Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.610539 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="fa5fc8fe9fcf17ffdc13575cbb4eab5f0084a2a716b2057a4159077c640053ba" Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.612636 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565330-2kgwr" event={"ID":"32da5e9d-72fb-43b8-903f-67dcbb0187bf","Type":"ContainerStarted","Data":"856850e4b24f916e772618b6af943d3b48262190519902ac549d6c3f7286b494"} Mar 19 11:30:03 crc kubenswrapper[4765]: I0319 11:30:03.632767 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565330-2kgwr" podStartSLOduration=1.644261043 podStartE2EDuration="3.632750414s" podCreationTimestamp="2026-03-19 11:30:00 +0000 UTC" firstStartedPulling="2026-03-19 11:30:00.994774435 +0000 UTC m=+4099.343719977" lastFinishedPulling="2026-03-19 11:30:02.983263796 +0000 UTC m=+4101.332209348" observedRunningTime="2026-03-19 11:30:03.626097185 +0000 UTC m=+4101.975042737" watchObservedRunningTime="2026-03-19 11:30:03.632750414 +0000 UTC m=+4101.981695956" Mar 19 11:30:04 crc kubenswrapper[4765]: I0319 11:30:04.292416 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-67b57ccc79-wx8k9_121bed92-a505-40d7-83f1-f3163088df2a/neutron-httpd/0.log" Mar 19 11:30:04 crc kubenswrapper[4765]: I0319 11:30:04.349179 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_neutron-67b57ccc79-wx8k9_121bed92-a505-40d7-83f1-f3163088df2a/neutron-api/0.log" Mar 19 11:30:04 crc kubenswrapper[4765]: I0319 11:30:04.379617 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz"] Mar 19 11:30:04 crc kubenswrapper[4765]: I0319 11:30:04.389532 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29565285-mbxvz"] Mar 19 11:30:04 crc kubenswrapper[4765]: I0319 11:30:04.413380 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_libvirt-edpm-deployment-openstack-edpm-ipam-kplgb_895f9304-5267-4b0b-acac-7e0d279b8866/libvirt-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:04 crc kubenswrapper[4765]: I0319 11:30:04.629455 4765 generic.go:334] "Generic (PLEG): container finished" podID="32da5e9d-72fb-43b8-903f-67dcbb0187bf" containerID="856850e4b24f916e772618b6af943d3b48262190519902ac549d6c3f7286b494" exitCode=0 Mar 19 11:30:04 crc kubenswrapper[4765]: I0319 11:30:04.629537 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565330-2kgwr" event={"ID":"32da5e9d-72fb-43b8-903f-67dcbb0187bf","Type":"ContainerDied","Data":"856850e4b24f916e772618b6af943d3b48262190519902ac549d6c3f7286b494"} Mar 19 11:30:04 crc kubenswrapper[4765]: I0319 11:30:04.811224 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_neutron-metadata-edpm-deployment-openstack-edpm-ipam-2jkbt_5987706f-bbd1-4eeb-908e-dd158089aea5/neutron-metadata-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:05 crc kubenswrapper[4765]: I0319 11:30:05.405069 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4b32fc33-5dc9-44b4-9313-1ad458fe9473/nova-api-log/0.log" Mar 19 11:30:05 crc kubenswrapper[4765]: I0319 11:30:05.431286 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_nova-cell0-conductor-0_3eb0de8e-0a7f-4324-8195-2bab8419c2ba/nova-cell0-conductor-conductor/0.log" Mar 19 11:30:05 crc kubenswrapper[4765]: I0319 11:30:05.889427 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-conductor-0_1ead2164-cda8-432f-b397-2866c55ccdbd/nova-cell1-conductor-conductor/0.log" Mar 19 11:30:05 crc kubenswrapper[4765]: I0319 11:30:05.944366 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-cell1-novncproxy-0_b92cc2fe-932d-4290-8331-225b2c5011d4/nova-cell1-novncproxy-novncproxy/0.log" Mar 19 11:30:06 crc kubenswrapper[4765]: I0319 11:30:06.004674 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-api-0_4b32fc33-5dc9-44b4-9313-1ad458fe9473/nova-api-api/0.log" Mar 19 11:30:06 crc kubenswrapper[4765]: I0319 11:30:06.204062 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565330-2kgwr" Mar 19 11:30:06 crc kubenswrapper[4765]: I0319 11:30:06.342783 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5603f135-db39-4e98-b372-6ec55cbc3351/nova-metadata-log/0.log" Mar 19 11:30:06 crc kubenswrapper[4765]: I0319 11:30:06.358893 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-69s6d\" (UniqueName: \"kubernetes.io/projected/32da5e9d-72fb-43b8-903f-67dcbb0187bf-kube-api-access-69s6d\") pod \"32da5e9d-72fb-43b8-903f-67dcbb0187bf\" (UID: \"32da5e9d-72fb-43b8-903f-67dcbb0187bf\") " Mar 19 11:30:06 crc kubenswrapper[4765]: I0319 11:30:06.365199 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/32da5e9d-72fb-43b8-903f-67dcbb0187bf-kube-api-access-69s6d" (OuterVolumeSpecName: "kube-api-access-69s6d") pod "32da5e9d-72fb-43b8-903f-67dcbb0187bf" (UID: "32da5e9d-72fb-43b8-903f-67dcbb0187bf"). InnerVolumeSpecName "kube-api-access-69s6d". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:30:06 crc kubenswrapper[4765]: I0319 11:30:06.373002 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3549b53-da9a-4aa4-b4b7-9e89eb017916" path="/var/lib/kubelet/pods/c3549b53-da9a-4aa4-b4b7-9e89eb017916/volumes" Mar 19 11:30:06 crc kubenswrapper[4765]: I0319 11:30:06.461659 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-69s6d\" (UniqueName: \"kubernetes.io/projected/32da5e9d-72fb-43b8-903f-67dcbb0187bf-kube-api-access-69s6d\") on node \"crc\" DevicePath \"\"" Mar 19 11:30:06 crc kubenswrapper[4765]: I0319 11:30:06.647540 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565330-2kgwr" event={"ID":"32da5e9d-72fb-43b8-903f-67dcbb0187bf","Type":"ContainerDied","Data":"4910348f49fa155fa4005b15a664fa080f6e740f36dad74ac40a499870208cc4"} Mar 19 11:30:06 crc kubenswrapper[4765]: I0319 11:30:06.647852 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4910348f49fa155fa4005b15a664fa080f6e740f36dad74ac40a499870208cc4" Mar 19 11:30:06 crc kubenswrapper[4765]: I0319 11:30:06.647618 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565330-2kgwr" Mar 19 11:30:06 crc kubenswrapper[4765]: I0319 11:30:06.689450 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565324-52nm7"] Mar 19 11:30:06 crc kubenswrapper[4765]: I0319 11:30:06.703064 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565324-52nm7"] Mar 19 11:30:06 crc kubenswrapper[4765]: I0319 11:30:06.905894 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-scheduler-0_584d891c-1a52-4300-b19a-51a3594bdccf/nova-scheduler-scheduler/0.log" Mar 19 11:30:07 crc kubenswrapper[4765]: I0319 11:30:07.013322 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-metadata-0_5603f135-db39-4e98-b372-6ec55cbc3351/nova-metadata-metadata/0.log" Mar 19 11:30:07 crc kubenswrapper[4765]: I0319 11:30:07.028229 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_85cd8112-bca8-45df-b61a-d2690fbbfb16/mysql-bootstrap/0.log" Mar 19 11:30:07 crc kubenswrapper[4765]: I0319 11:30:07.074564 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_nova-edpm-deployment-openstack-edpm-ipam-skpp9_f9cf075c-03d2-4254-9ab9-5500d4f42186/nova-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:07 crc kubenswrapper[4765]: I0319 11:30:07.200218 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_85cd8112-bca8-45df-b61a-d2690fbbfb16/mysql-bootstrap/0.log" Mar 19 11:30:07 crc kubenswrapper[4765]: I0319 11:30:07.353694 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-cell1-galera-0_85cd8112-bca8-45df-b61a-d2690fbbfb16/galera/0.log" Mar 19 11:30:07 crc kubenswrapper[4765]: I0319 11:30:07.368061 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_openstack-galera-0_adf887ce-99cf-47a0-89e8-2db5aa92a9ca/mysql-bootstrap/0.log" Mar 19 11:30:07 crc kubenswrapper[4765]: I0319 11:30:07.557629 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adf887ce-99cf-47a0-89e8-2db5aa92a9ca/mysql-bootstrap/0.log" Mar 19 11:30:07 crc kubenswrapper[4765]: I0319 11:30:07.581011 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstack-galera-0_adf887ce-99cf-47a0-89e8-2db5aa92a9ca/galera/0.log" Mar 19 11:30:07 crc kubenswrapper[4765]: I0319 11:30:07.635032 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_openstackclient_1cda8252-a988-49d1-a566-8d9989b86034/openstackclient/0.log" Mar 19 11:30:08 crc kubenswrapper[4765]: I0319 11:30:08.299705 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ct9xj_5272132e-561c-46b9-92c8-1714e40b3303/ovn-controller/0.log" Mar 19 11:30:08 crc kubenswrapper[4765]: I0319 11:30:08.356017 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:30:08 crc kubenswrapper[4765]: E0319 11:30:08.356265 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:30:08 crc kubenswrapper[4765]: I0319 11:30:08.383308 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad6ae4f9-0eef-4a46-b395-bc07375a436c" path="/var/lib/kubelet/pods/ad6ae4f9-0eef-4a46-b395-bc07375a436c/volumes" Mar 19 11:30:08 crc kubenswrapper[4765]: I0319 11:30:08.595563 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_ovn-controller-metrics-sq8vg_c2dd6b2c-bf15-47e4-b9c6-775b176fbadb/openstack-network-exporter/0.log" Mar 19 11:30:08 crc kubenswrapper[4765]: I0319 11:30:08.789723 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bmbgn_40f94856-44b1-42f4-9aa4-9b46f3fe13f3/ovsdb-server-init/0.log" Mar 19 11:30:09 crc kubenswrapper[4765]: I0319 11:30:09.063230 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bmbgn_40f94856-44b1-42f4-9aa4-9b46f3fe13f3/ovs-vswitchd/0.log" Mar 19 11:30:09 crc kubenswrapper[4765]: I0319 11:30:09.111734 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bmbgn_40f94856-44b1-42f4-9aa4-9b46f3fe13f3/ovsdb-server-init/0.log" Mar 19 11:30:09 crc kubenswrapper[4765]: I0319 11:30:09.140815 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-controller-ovs-bmbgn_40f94856-44b1-42f4-9aa4-9b46f3fe13f3/ovsdb-server/0.log" Mar 19 11:30:09 crc kubenswrapper[4765]: I0319 11:30:09.403663 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_95192c6a-3899-4f62-bfca-47ad91bd17f1/openstack-network-exporter/0.log" Mar 19 11:30:09 crc kubenswrapper[4765]: I0319 11:30:09.441809 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-edpm-deployment-openstack-edpm-ipam-fgjhz_db0a9fa6-2229-425c-8170-ebcc7dce147f/ovn-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:09 crc kubenswrapper[4765]: I0319 11:30:09.454382 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovn-northd-0_95192c6a-3899-4f62-bfca-47ad91bd17f1/ovn-northd/0.log" Mar 19 11:30:10 crc kubenswrapper[4765]: I0319 11:30:10.148084 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_17555a74-a31f-4d09-8b23-b8c774024c58/openstack-network-exporter/0.log" Mar 19 11:30:10 crc kubenswrapper[4765]: I0319 11:30:10.155315 
4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-nb-0_17555a74-a31f-4d09-8b23-b8c774024c58/ovsdbserver-nb/0.log" Mar 19 11:30:10 crc kubenswrapper[4765]: I0319 11:30:10.375604 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_203ad8ad-1b9e-4191-99a0-7bfd9c193de8/openstack-network-exporter/0.log" Mar 19 11:30:10 crc kubenswrapper[4765]: I0319 11:30:10.440895 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ovsdbserver-sb-0_203ad8ad-1b9e-4191-99a0-7bfd9c193de8/ovsdbserver-sb/0.log" Mar 19 11:30:10 crc kubenswrapper[4765]: I0319 11:30:10.551466 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54bc4cb6bd-w8bvw_21f8be56-b9b5-4205-8de4-dd4d204b9f3d/placement-api/0.log" Mar 19 11:30:10 crc kubenswrapper[4765]: I0319 11:30:10.765324 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_placement-54bc4cb6bd-w8bvw_21f8be56-b9b5-4205-8de4-dd4d204b9f3d/placement-log/0.log" Mar 19 11:30:10 crc kubenswrapper[4765]: I0319 11:30:10.810612 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c3043d68-f6dc-4095-bc0e-62b2282dd297/setup-container/0.log" Mar 19 11:30:10 crc kubenswrapper[4765]: I0319 11:30:10.850844 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c3043d68-f6dc-4095-bc0e-62b2282dd297/setup-container/0.log" Mar 19 11:30:10 crc kubenswrapper[4765]: I0319 11:30:10.977760 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-cell1-server-0_c3043d68-f6dc-4095-bc0e-62b2282dd297/rabbitmq/0.log" Mar 19 11:30:11 crc kubenswrapper[4765]: I0319 11:30:11.067665 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_53406a09-7bdd-4517-ac01-0823bce386bc/setup-container/0.log" Mar 19 11:30:11 crc kubenswrapper[4765]: I0319 11:30:11.287078 4765 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openstack_rabbitmq-server-0_53406a09-7bdd-4517-ac01-0823bce386bc/setup-container/0.log" Mar 19 11:30:11 crc kubenswrapper[4765]: I0319 11:30:11.347826 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_reboot-os-edpm-deployment-openstack-edpm-ipam-lrpbt_525004be-ff4e-4c2d-ad4d-0ed018eecc09/reboot-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:11 crc kubenswrapper[4765]: I0319 11:30:11.353990 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_rabbitmq-server-0_53406a09-7bdd-4517-ac01-0823bce386bc/rabbitmq/0.log" Mar 19 11:30:11 crc kubenswrapper[4765]: I0319 11:30:11.649493 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_redhat-edpm-deployment-openstack-edpm-ipam-zsxdz_af5d1fcd-a500-4d64-a86a-37cae82350d3/redhat-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:11 crc kubenswrapper[4765]: I0319 11:30:11.688716 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_repo-setup-edpm-deployment-openstack-edpm-ipam-kn2b8_ab6c9186-ce11-4085-9c4c-c0964cb170d8/repo-setup-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:11 crc kubenswrapper[4765]: I0319 11:30:11.912173 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_run-os-edpm-deployment-openstack-edpm-ipam-jcnm5_0d2d350b-0950-4e12-8ae8-57c8983079aa/run-os-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:11 crc kubenswrapper[4765]: I0319 11:30:11.972488 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_ssh-known-hosts-edpm-deployment-gc6dd_f46a25a2-f362-487c-9511-b9888a18b08e/ssh-known-hosts-edpm-deployment/0.log" Mar 19 11:30:12 crc kubenswrapper[4765]: I0319 11:30:12.394921 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-556979b4dc-zj26d_00e0de39-87cf-4a6e-8980-a294f329e430/proxy-server/0.log" Mar 19 11:30:12 crc kubenswrapper[4765]: I0319 11:30:12.518890 4765 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openstack_swift-ring-rebalance-f5w89_bd78fb4a-24b1-4fb7-8994-3668d29ff042/swift-ring-rebalance/0.log" Mar 19 11:30:12 crc kubenswrapper[4765]: I0319 11:30:12.578868 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-proxy-556979b4dc-zj26d_00e0de39-87cf-4a6e-8980-a294f329e430/proxy-httpd/0.log" Mar 19 11:30:12 crc kubenswrapper[4765]: I0319 11:30:12.716705 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/account-auditor/0.log" Mar 19 11:30:12 crc kubenswrapper[4765]: I0319 11:30:12.757121 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/account-reaper/0.log" Mar 19 11:30:12 crc kubenswrapper[4765]: I0319 11:30:12.945587 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/account-replicator/0.log" Mar 19 11:30:12 crc kubenswrapper[4765]: I0319 11:30:12.989343 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/container-auditor/0.log" Mar 19 11:30:12 crc kubenswrapper[4765]: I0319 11:30:12.994611 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/account-server/0.log" Mar 19 11:30:13 crc kubenswrapper[4765]: I0319 11:30:13.046236 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/container-replicator/0.log" Mar 19 11:30:13 crc kubenswrapper[4765]: I0319 11:30:13.211284 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/container-server/0.log" Mar 19 11:30:13 crc kubenswrapper[4765]: I0319 11:30:13.258228 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/container-updater/0.log" Mar 19 11:30:13 crc kubenswrapper[4765]: I0319 11:30:13.274751 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/object-auditor/0.log" Mar 19 11:30:13 crc kubenswrapper[4765]: I0319 11:30:13.316712 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/object-expirer/0.log" Mar 19 11:30:13 crc kubenswrapper[4765]: I0319 11:30:13.458321 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/object-replicator/0.log" Mar 19 11:30:13 crc kubenswrapper[4765]: I0319 11:30:13.521908 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/object-updater/0.log" Mar 19 11:30:13 crc kubenswrapper[4765]: I0319 11:30:13.529080 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/object-server/0.log" Mar 19 11:30:13 crc kubenswrapper[4765]: I0319 11:30:13.586730 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/rsync/0.log" Mar 19 11:30:13 crc kubenswrapper[4765]: I0319 11:30:13.681322 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_swift-storage-0_21734dce-e034-473f-a919-7026f837ede2/swift-recon-cron/0.log" Mar 19 11:30:13 crc kubenswrapper[4765]: I0319 11:30:13.960814 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_tempest-tests-tempest_65eabf0c-0a01-4d5b-aefd-d9ce064e1d66/tempest-tests-tempest-tests-runner/0.log" Mar 19 11:30:14 crc kubenswrapper[4765]: I0319 11:30:14.116135 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack_test-operator-logs-pod-tempest-tempest-tests-tempest_9bf5c002-a318-47f6-8ba0-5a39c88daeff/test-operator-logs-container/0.log" Mar 19 11:30:14 crc kubenswrapper[4765]: I0319 11:30:14.261636 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_telemetry-edpm-deployment-openstack-edpm-ipam-r2ttq_611d61c3-8dd1-46e4-a579-ded4e91917ed/telemetry-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:14 crc kubenswrapper[4765]: I0319 11:30:14.378872 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_validate-network-edpm-deployment-openstack-edpm-ipam-xnkdx_3a573215-571a-49dc-9903-82134a77d196/validate-network-edpm-deployment-openstack-edpm-ipam/0.log" Mar 19 11:30:16 crc kubenswrapper[4765]: I0319 11:30:16.558632 4765 scope.go:117] "RemoveContainer" containerID="36f29a72904deb4ccb3bd3a786e69b9b23b838596177b6a63037afd7c4dcb067" Mar 19 11:30:16 crc kubenswrapper[4765]: I0319 11:30:16.609710 4765 scope.go:117] "RemoveContainer" containerID="ad104457548f180ec2ec3f0bb98a753afc784404a036958379594b94817b4bd4" Mar 19 11:30:22 crc kubenswrapper[4765]: I0319 11:30:22.363841 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:30:22 crc kubenswrapper[4765]: E0319 11:30:22.364763 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:30:25 crc kubenswrapper[4765]: I0319 11:30:25.123809 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_memcached-0_81d90cd2-d47a-47c5-aeff-20f377ed9159/memcached/0.log" Mar 19 11:30:35 crc 
kubenswrapper[4765]: I0319 11:30:35.356783 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:30:35 crc kubenswrapper[4765]: I0319 11:30:35.927106 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"d10207e38e3d71513e5ccfc9b07c224dac480f88917cc773ac045cc45750c785"} Mar 19 11:30:41 crc kubenswrapper[4765]: I0319 11:30:41.790454 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck_b340ebf3-8897-41c3-8a3e-733e4afc3fdf/util/0.log" Mar 19 11:30:41 crc kubenswrapper[4765]: I0319 11:30:41.941434 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck_b340ebf3-8897-41c3-8a3e-733e4afc3fdf/pull/0.log" Mar 19 11:30:41 crc kubenswrapper[4765]: I0319 11:30:41.973826 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck_b340ebf3-8897-41c3-8a3e-733e4afc3fdf/util/0.log" Mar 19 11:30:41 crc kubenswrapper[4765]: I0319 11:30:41.985047 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck_b340ebf3-8897-41c3-8a3e-733e4afc3fdf/pull/0.log" Mar 19 11:30:42 crc kubenswrapper[4765]: I0319 11:30:42.147202 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck_b340ebf3-8897-41c3-8a3e-733e4afc3fdf/util/0.log" Mar 19 11:30:42 crc kubenswrapper[4765]: I0319 11:30:42.148217 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck_b340ebf3-8897-41c3-8a3e-733e4afc3fdf/pull/0.log" Mar 19 11:30:42 crc kubenswrapper[4765]: I0319 11:30:42.180344 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_3372c49e3507b610e9d5a42b92ce3db446a65d7677cc3a3767be9d36d7t4rck_b340ebf3-8897-41c3-8a3e-733e4afc3fdf/extract/0.log" Mar 19 11:30:42 crc kubenswrapper[4765]: I0319 11:30:42.386108 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-kvrz2_1954f819-78c2-46fd-a6bf-c626d50ef527/manager/0.log" Mar 19 11:30:42 crc kubenswrapper[4765]: I0319 11:30:42.583580 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-6h4tg_c5e9a9c2-b7c8-4e1a-8c60-e6fcb0616e01/manager/0.log" Mar 19 11:30:42 crc kubenswrapper[4765]: I0319 11:30:42.697231 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-kzvs8_3e1ee5ea-abd4-4a73-840e-43fbd3732cfd/manager/0.log" Mar 19 11:30:42 crc kubenswrapper[4765]: I0319 11:30:42.906799 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-4zf8v_123b9f81-d315-44b3-a6ec-d777cc18ab7b/manager/0.log" Mar 19 11:30:43 crc kubenswrapper[4765]: I0319 11:30:43.085703 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-qrqf8_047a8026-b206-4eb6-9630-3b550af68d3a/manager/0.log" Mar 19 11:30:43 crc kubenswrapper[4765]: I0319 11:30:43.381017 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-p27jn_cdeba207-ced7-4575-9c08-c001d85b0a93/manager/0.log" Mar 19 11:30:43 crc kubenswrapper[4765]: I0319 11:30:43.590813 4765 
log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-7b9c774f96-2gnht_0cd862fe-c896-4fa6-a9ba-b1af6441f777/manager/0.log" Mar 19 11:30:43 crc kubenswrapper[4765]: I0319 11:30:43.663889 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-q9lpg_b94397e1-cedc-4048-9253-12c60b0a9bfd/manager/0.log" Mar 19 11:30:43 crc kubenswrapper[4765]: I0319 11:30:43.830130 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-x8k5g_9225dfe1-877e-43a2-9034-0e355019aa04/manager/0.log" Mar 19 11:30:44 crc kubenswrapper[4765]: I0319 11:30:44.003229 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-bg8b9_d9dca6f4-a577-44fa-a959-8398fb57dca0/manager/0.log" Mar 19 11:30:44 crc kubenswrapper[4765]: I0319 11:30:44.089022 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-ncc44_1d54708d-8829-411d-a632-ce3b53b7aeaa/manager/0.log" Mar 19 11:30:44 crc kubenswrapper[4765]: I0319 11:30:44.146292 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-78t28_adc31858-63eb-4d03-b79c-c1a4054725af/manager/0.log" Mar 19 11:30:44 crc kubenswrapper[4765]: I0319 11:30:44.314518 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-wxsgc_6647190b-c26b-4c57-bc84-7e5cfe6a5649/manager/0.log" Mar 19 11:30:44 crc kubenswrapper[4765]: I0319 11:30:44.346115 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-b4kfn_473e9670-e72d-4e54-8b06-9d73666cbfc0/manager/0.log" Mar 19 11:30:44 crc kubenswrapper[4765]: I0319 
11:30:44.499997 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-vqghn_981806c8-2390-44ac-a6f8-81c5f5bb0374/manager/0.log" Mar 19 11:30:44 crc kubenswrapper[4765]: I0319 11:30:44.600830 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-76ccd786f6-h42th_538ce45d-9424-41e4-8d9e-ff63db0df6be/operator/0.log" Mar 19 11:30:44 crc kubenswrapper[4765]: I0319 11:30:44.836433 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-fcqz8_0836451e-5b5f-47bd-8722-283ab5d34a5c/registry-server/0.log" Mar 19 11:30:45 crc kubenswrapper[4765]: I0319 11:30:45.119389 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-v78t5_412ddd32-a861-4cec-8d5e-bb21069835e9/manager/0.log" Mar 19 11:30:45 crc kubenswrapper[4765]: I0319 11:30:45.219380 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-2zbqz_4be8b0ab-5eda-4bf9-8fc4-20dcbe9c406d/manager/0.log" Mar 19 11:30:45 crc kubenswrapper[4765]: I0319 11:30:45.383478 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_rabbitmq-cluster-operator-manager-668c99d594-hsxfs_408f748b-ca2b-4ae8-8994-63d7da422df9/operator/0.log" Mar 19 11:30:45 crc kubenswrapper[4765]: I0319 11:30:45.616794 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-hfm25_c3466125-06fe-4c5d-872d-a778806a0e23/manager/0.log" Mar 19 11:30:45 crc kubenswrapper[4765]: I0319 11:30:45.769770 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-7wsnh_ae2caf34-b7b2-486c-9e9e-a48cf04eed87/manager/0.log" Mar 19 11:30:45 crc kubenswrapper[4765]: 
I0319 11:30:45.835302 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-c9c9c96bc-4hzdg_970bc693-0463-4dfe-8870-fac695fffcae/manager/0.log" Mar 19 11:30:45 crc kubenswrapper[4765]: I0319 11:30:45.861120 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-xdlrz_e843ba99-8859-41d7-9142-ed9227b4d8e1/manager/0.log" Mar 19 11:30:46 crc kubenswrapper[4765]: I0319 11:30:46.003942 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-dbv47_3c3ef321-6a40-4d2e-a414-ad6a65cd32cf/manager/0.log" Mar 19 11:31:04 crc kubenswrapper[4765]: I0319 11:31:04.877414 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-jdqns_dc04fc5b-a6ad-45a8-a26f-402f79dd3ba0/control-plane-machine-set-operator/0.log" Mar 19 11:31:05 crc kubenswrapper[4765]: I0319 11:31:05.077151 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7hqvg_993cdcd1-8323-49aa-b587-5a8c344a2077/kube-rbac-proxy/0.log" Mar 19 11:31:05 crc kubenswrapper[4765]: I0319 11:31:05.096547 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-7hqvg_993cdcd1-8323-49aa-b587-5a8c344a2077/machine-api-operator/0.log" Mar 19 11:31:17 crc kubenswrapper[4765]: I0319 11:31:17.845142 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-nb6dw_4a6670fe-5988-4bfd-8468-b2a5f6cd9997/cert-manager-controller/0.log" Mar 19 11:31:17 crc kubenswrapper[4765]: I0319 11:31:17.998169 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-tp66g_67148642-28c7-4217-b91a-3badb42c4c38/cert-manager-cainjector/0.log" Mar 19 11:31:18 crc 
kubenswrapper[4765]: I0319 11:31:18.044355 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-w862m_1b499d05-d228-4268-8b1f-8b3c8687870f/cert-manager-webhook/0.log" Mar 19 11:31:30 crc kubenswrapper[4765]: I0319 11:31:30.967537 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-x2vbf_76583484-f8aa-4a95-8450-206a93fb2b6c/nmstate-console-plugin/0.log" Mar 19 11:31:31 crc kubenswrapper[4765]: I0319 11:31:31.128011 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-n2hqs_e3d573a8-f56f-45e5-9905-5810e82af6ac/nmstate-handler/0.log" Mar 19 11:31:31 crc kubenswrapper[4765]: I0319 11:31:31.184695 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-z6jbc_5a5acf7a-e38f-4ef1-9576-eae0d8e9a582/kube-rbac-proxy/0.log" Mar 19 11:31:31 crc kubenswrapper[4765]: I0319 11:31:31.271390 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-z6jbc_5a5acf7a-e38f-4ef1-9576-eae0d8e9a582/nmstate-metrics/0.log" Mar 19 11:31:31 crc kubenswrapper[4765]: I0319 11:31:31.352086 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-9bsj5_75557987-e600-4f26-b66a-45a76da143cf/nmstate-operator/0.log" Mar 19 11:31:31 crc kubenswrapper[4765]: I0319 11:31:31.437324 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-rl6q7_ac4e199b-7261-41f0-b9e9-51b167be05a7/nmstate-webhook/0.log" Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.151876 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565332-k5j8m"] Mar 19 11:32:00 crc kubenswrapper[4765]: E0319 11:32:00.153425 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="36bffb14-466b-4835-a23e-8217512fa207" 
containerName="collect-profiles" Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.153444 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="36bffb14-466b-4835-a23e-8217512fa207" containerName="collect-profiles" Mar 19 11:32:00 crc kubenswrapper[4765]: E0319 11:32:00.153509 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="32da5e9d-72fb-43b8-903f-67dcbb0187bf" containerName="oc" Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.153516 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="32da5e9d-72fb-43b8-903f-67dcbb0187bf" containerName="oc" Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.153732 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="36bffb14-466b-4835-a23e-8217512fa207" containerName="collect-profiles" Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.153755 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="32da5e9d-72fb-43b8-903f-67dcbb0187bf" containerName="oc" Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.154650 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565332-k5j8m" Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.156538 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.156722 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.157030 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.176643 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565332-k5j8m"] Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.190638 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrfvp\" (UniqueName: \"kubernetes.io/projected/a2de0d93-78ae-47be-94a0-29c967a62a6b-kube-api-access-xrfvp\") pod \"auto-csr-approver-29565332-k5j8m\" (UID: \"a2de0d93-78ae-47be-94a0-29c967a62a6b\") " pod="openshift-infra/auto-csr-approver-29565332-k5j8m" Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.294212 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xrfvp\" (UniqueName: \"kubernetes.io/projected/a2de0d93-78ae-47be-94a0-29c967a62a6b-kube-api-access-xrfvp\") pod \"auto-csr-approver-29565332-k5j8m\" (UID: \"a2de0d93-78ae-47be-94a0-29c967a62a6b\") " pod="openshift-infra/auto-csr-approver-29565332-k5j8m" Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.314395 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrfvp\" (UniqueName: \"kubernetes.io/projected/a2de0d93-78ae-47be-94a0-29c967a62a6b-kube-api-access-xrfvp\") pod \"auto-csr-approver-29565332-k5j8m\" (UID: \"a2de0d93-78ae-47be-94a0-29c967a62a6b\") " 
pod="openshift-infra/auto-csr-approver-29565332-k5j8m" Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.477262 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565332-k5j8m" Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.876467 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-2rmkf_0f10db70-5575-427a-b0de-f36a4c0a5feb/kube-rbac-proxy/0.log" Mar 19 11:32:00 crc kubenswrapper[4765]: I0319 11:32:00.995920 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-2rmkf_0f10db70-5575-427a-b0de-f36a4c0a5feb/controller/0.log" Mar 19 11:32:01 crc kubenswrapper[4765]: I0319 11:32:01.000324 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565332-k5j8m"] Mar 19 11:32:01 crc kubenswrapper[4765]: I0319 11:32:01.020226 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 11:32:01 crc kubenswrapper[4765]: I0319 11:32:01.161534 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-frr-files/0.log" Mar 19 11:32:01 crc kubenswrapper[4765]: I0319 11:32:01.579321 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-frr-files/0.log" Mar 19 11:32:01 crc kubenswrapper[4765]: I0319 11:32:01.608909 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-reloader/0.log" Mar 19 11:32:01 crc kubenswrapper[4765]: I0319 11:32:01.638900 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-reloader/0.log" Mar 19 11:32:01 crc kubenswrapper[4765]: I0319 11:32:01.643039 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-metrics/0.log" Mar 19 11:32:01 crc kubenswrapper[4765]: I0319 11:32:01.673392 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565332-k5j8m" event={"ID":"a2de0d93-78ae-47be-94a0-29c967a62a6b","Type":"ContainerStarted","Data":"eb37a00b95ca8f5f36ff138535c39a7c93683a9123571c53107d455cf6d6cafa"} Mar 19 11:32:01 crc kubenswrapper[4765]: I0319 11:32:01.848236 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-frr-files/0.log" Mar 19 11:32:01 crc kubenswrapper[4765]: I0319 11:32:01.855119 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-reloader/0.log" Mar 19 11:32:01 crc kubenswrapper[4765]: I0319 11:32:01.856318 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-metrics/0.log" Mar 19 11:32:01 crc kubenswrapper[4765]: I0319 11:32:01.890811 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-metrics/0.log" Mar 19 11:32:02 crc kubenswrapper[4765]: I0319 11:32:02.532453 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/controller/0.log" Mar 19 11:32:02 crc kubenswrapper[4765]: I0319 11:32:02.538030 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-metrics/0.log" Mar 19 11:32:02 crc kubenswrapper[4765]: I0319 11:32:02.544383 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-frr-files/0.log" Mar 19 11:32:02 crc kubenswrapper[4765]: I0319 11:32:02.577469 4765 log.go:25] "Finished 
parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/cp-reloader/0.log" Mar 19 11:32:02 crc kubenswrapper[4765]: I0319 11:32:02.689138 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565332-k5j8m" event={"ID":"a2de0d93-78ae-47be-94a0-29c967a62a6b","Type":"ContainerStarted","Data":"8c214342dac72475636a42bad146929000979cdf34f5e47daa7af5009d771684"} Mar 19 11:32:02 crc kubenswrapper[4765]: I0319 11:32:02.705950 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29565332-k5j8m" podStartSLOduration=1.836979628 podStartE2EDuration="2.705932183s" podCreationTimestamp="2026-03-19 11:32:00 +0000 UTC" firstStartedPulling="2026-03-19 11:32:01.019926926 +0000 UTC m=+4219.368872468" lastFinishedPulling="2026-03-19 11:32:01.888879471 +0000 UTC m=+4220.237825023" observedRunningTime="2026-03-19 11:32:02.700308291 +0000 UTC m=+4221.049253833" watchObservedRunningTime="2026-03-19 11:32:02.705932183 +0000 UTC m=+4221.054877725" Mar 19 11:32:02 crc kubenswrapper[4765]: I0319 11:32:02.774905 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/kube-rbac-proxy-frr/0.log" Mar 19 11:32:02 crc kubenswrapper[4765]: I0319 11:32:02.849940 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/frr-metrics/0.log" Mar 19 11:32:02 crc kubenswrapper[4765]: I0319 11:32:02.850796 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/kube-rbac-proxy/0.log" Mar 19 11:32:03 crc kubenswrapper[4765]: I0319 11:32:03.070147 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/reloader/0.log" Mar 19 11:32:03 crc kubenswrapper[4765]: I0319 
11:32:03.224794 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-f6fhw_bff02354-3273-4396-b996-06a749a9692f/frr-k8s-webhook-server/0.log" Mar 19 11:32:03 crc kubenswrapper[4765]: I0319 11:32:03.430120 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-7b6fd59d6c-c8kfg_fc004464-2eb9-4b7d-addf-91b7b69e01b6/manager/0.log" Mar 19 11:32:03 crc kubenswrapper[4765]: I0319 11:32:03.641614 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-757bfbd67b-zzdvj_d3f409a1-a978-47fc-9907-fec4a720ae18/webhook-server/0.log" Mar 19 11:32:03 crc kubenswrapper[4765]: I0319 11:32:03.703356 4765 generic.go:334] "Generic (PLEG): container finished" podID="a2de0d93-78ae-47be-94a0-29c967a62a6b" containerID="8c214342dac72475636a42bad146929000979cdf34f5e47daa7af5009d771684" exitCode=0 Mar 19 11:32:03 crc kubenswrapper[4765]: I0319 11:32:03.703413 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565332-k5j8m" event={"ID":"a2de0d93-78ae-47be-94a0-29c967a62a6b","Type":"ContainerDied","Data":"8c214342dac72475636a42bad146929000979cdf34f5e47daa7af5009d771684"} Mar 19 11:32:03 crc kubenswrapper[4765]: I0319 11:32:03.730079 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-czcsr_afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71/kube-rbac-proxy/0.log" Mar 19 11:32:04 crc kubenswrapper[4765]: I0319 11:32:04.428561 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-czcsr_afbbbf15-03b8-4ee7-b4ce-dd9e01a2cd71/speaker/0.log" Mar 19 11:32:04 crc kubenswrapper[4765]: I0319 11:32:04.507171 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-w58kf_0f2f40b4-6884-47cf-9845-7a45001ceda5/frr/0.log" Mar 19 11:32:05 crc kubenswrapper[4765]: I0319 11:32:05.049502 4765 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565332-k5j8m" Mar 19 11:32:05 crc kubenswrapper[4765]: I0319 11:32:05.107071 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrfvp\" (UniqueName: \"kubernetes.io/projected/a2de0d93-78ae-47be-94a0-29c967a62a6b-kube-api-access-xrfvp\") pod \"a2de0d93-78ae-47be-94a0-29c967a62a6b\" (UID: \"a2de0d93-78ae-47be-94a0-29c967a62a6b\") " Mar 19 11:32:05 crc kubenswrapper[4765]: I0319 11:32:05.120168 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2de0d93-78ae-47be-94a0-29c967a62a6b-kube-api-access-xrfvp" (OuterVolumeSpecName: "kube-api-access-xrfvp") pod "a2de0d93-78ae-47be-94a0-29c967a62a6b" (UID: "a2de0d93-78ae-47be-94a0-29c967a62a6b"). InnerVolumeSpecName "kube-api-access-xrfvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:32:05 crc kubenswrapper[4765]: I0319 11:32:05.208605 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xrfvp\" (UniqueName: \"kubernetes.io/projected/a2de0d93-78ae-47be-94a0-29c967a62a6b-kube-api-access-xrfvp\") on node \"crc\" DevicePath \"\"" Mar 19 11:32:05 crc kubenswrapper[4765]: I0319 11:32:05.464474 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565326-4vw7m"] Mar 19 11:32:05 crc kubenswrapper[4765]: I0319 11:32:05.476773 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565326-4vw7m"] Mar 19 11:32:05 crc kubenswrapper[4765]: I0319 11:32:05.722492 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565332-k5j8m" event={"ID":"a2de0d93-78ae-47be-94a0-29c967a62a6b","Type":"ContainerDied","Data":"eb37a00b95ca8f5f36ff138535c39a7c93683a9123571c53107d455cf6d6cafa"} Mar 19 11:32:05 crc kubenswrapper[4765]: I0319 11:32:05.722538 4765 pod_container_deletor.go:80] 
"Container not found in pod's containers" containerID="eb37a00b95ca8f5f36ff138535c39a7c93683a9123571c53107d455cf6d6cafa" Mar 19 11:32:05 crc kubenswrapper[4765]: I0319 11:32:05.722612 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565332-k5j8m" Mar 19 11:32:06 crc kubenswrapper[4765]: I0319 11:32:06.367743 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7d11b837-8c43-45ee-b838-71d13a8b0919" path="/var/lib/kubelet/pods/7d11b837-8c43-45ee-b838-71d13a8b0919/volumes" Mar 19 11:32:16 crc kubenswrapper[4765]: I0319 11:32:16.138769 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z_2bd39f53-5aca-44bd-93ed-bff9ffafb381/util/0.log" Mar 19 11:32:16 crc kubenswrapper[4765]: I0319 11:32:16.349678 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z_2bd39f53-5aca-44bd-93ed-bff9ffafb381/pull/0.log" Mar 19 11:32:16 crc kubenswrapper[4765]: I0319 11:32:16.366990 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z_2bd39f53-5aca-44bd-93ed-bff9ffafb381/util/0.log" Mar 19 11:32:16 crc kubenswrapper[4765]: I0319 11:32:16.389834 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z_2bd39f53-5aca-44bd-93ed-bff9ffafb381/pull/0.log" Mar 19 11:32:16 crc kubenswrapper[4765]: I0319 11:32:16.540353 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z_2bd39f53-5aca-44bd-93ed-bff9ffafb381/util/0.log" Mar 19 11:32:16 crc kubenswrapper[4765]: I0319 11:32:16.585686 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z_2bd39f53-5aca-44bd-93ed-bff9ffafb381/pull/0.log" Mar 19 11:32:16 crc kubenswrapper[4765]: I0319 11:32:16.597378 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874zc49z_2bd39f53-5aca-44bd-93ed-bff9ffafb381/extract/0.log" Mar 19 11:32:16 crc kubenswrapper[4765]: I0319 11:32:16.755265 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2_379c6607-c195-4779-83b7-bdc20f7cda09/util/0.log" Mar 19 11:32:16 crc kubenswrapper[4765]: I0319 11:32:16.821914 4765 scope.go:117] "RemoveContainer" containerID="4ad4991be769ac3ad1abcf171d281476125074255f30dae44bdabe9adfa7efa1" Mar 19 11:32:16 crc kubenswrapper[4765]: I0319 11:32:16.925812 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2_379c6607-c195-4779-83b7-bdc20f7cda09/util/0.log" Mar 19 11:32:16 crc kubenswrapper[4765]: I0319 11:32:16.978993 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2_379c6607-c195-4779-83b7-bdc20f7cda09/pull/0.log" Mar 19 11:32:16 crc kubenswrapper[4765]: I0319 11:32:16.989742 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2_379c6607-c195-4779-83b7-bdc20f7cda09/pull/0.log" Mar 19 11:32:17 crc kubenswrapper[4765]: I0319 11:32:17.115526 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2_379c6607-c195-4779-83b7-bdc20f7cda09/pull/0.log" Mar 19 11:32:17 crc kubenswrapper[4765]: I0319 11:32:17.148315 4765 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2_379c6607-c195-4779-83b7-bdc20f7cda09/util/0.log" Mar 19 11:32:17 crc kubenswrapper[4765]: I0319 11:32:17.154179 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c92r2_379c6607-c195-4779-83b7-bdc20f7cda09/extract/0.log" Mar 19 11:32:17 crc kubenswrapper[4765]: I0319 11:32:17.336939 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cd4rs_00429786-088e-4d39-be6e-050615aeba42/extract-utilities/0.log" Mar 19 11:32:17 crc kubenswrapper[4765]: I0319 11:32:17.508123 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cd4rs_00429786-088e-4d39-be6e-050615aeba42/extract-content/0.log" Mar 19 11:32:17 crc kubenswrapper[4765]: I0319 11:32:17.512440 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cd4rs_00429786-088e-4d39-be6e-050615aeba42/extract-content/0.log" Mar 19 11:32:17 crc kubenswrapper[4765]: I0319 11:32:17.515330 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cd4rs_00429786-088e-4d39-be6e-050615aeba42/extract-utilities/0.log" Mar 19 11:32:17 crc kubenswrapper[4765]: I0319 11:32:17.719862 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cd4rs_00429786-088e-4d39-be6e-050615aeba42/extract-utilities/0.log" Mar 19 11:32:17 crc kubenswrapper[4765]: I0319 11:32:17.724192 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cd4rs_00429786-088e-4d39-be6e-050615aeba42/extract-content/0.log" Mar 19 11:32:17 crc kubenswrapper[4765]: I0319 11:32:17.961996 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-6rrpf_1c5d371b-d02f-497c-af99-4c138232f8c0/extract-utilities/0.log" Mar 19 11:32:18 crc kubenswrapper[4765]: I0319 11:32:18.261914 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6rrpf_1c5d371b-d02f-497c-af99-4c138232f8c0/extract-content/0.log" Mar 19 11:32:18 crc kubenswrapper[4765]: I0319 11:32:18.276768 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6rrpf_1c5d371b-d02f-497c-af99-4c138232f8c0/extract-utilities/0.log" Mar 19 11:32:18 crc kubenswrapper[4765]: I0319 11:32:18.294041 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6rrpf_1c5d371b-d02f-497c-af99-4c138232f8c0/extract-content/0.log" Mar 19 11:32:18 crc kubenswrapper[4765]: I0319 11:32:18.437506 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-cd4rs_00429786-088e-4d39-be6e-050615aeba42/registry-server/0.log" Mar 19 11:32:18 crc kubenswrapper[4765]: I0319 11:32:18.464581 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6rrpf_1c5d371b-d02f-497c-af99-4c138232f8c0/extract-utilities/0.log" Mar 19 11:32:18 crc kubenswrapper[4765]: I0319 11:32:18.516951 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-6rrpf_1c5d371b-d02f-497c-af99-4c138232f8c0/extract-content/0.log" Mar 19 11:32:18 crc kubenswrapper[4765]: I0319 11:32:18.747224 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-6xvzl_5e3d8e97-79f8-43d2-acf6-f20ef33cadd3/marketplace-operator/0.log" Mar 19 11:32:18 crc kubenswrapper[4765]: I0319 11:32:18.863044 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-6rrpf_1c5d371b-d02f-497c-af99-4c138232f8c0/registry-server/0.log" Mar 19 11:32:18 crc kubenswrapper[4765]: I0319 11:32:18.937286 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9chfw_df325686-add0-407b-afdf-f9093391d64c/extract-utilities/0.log" Mar 19 11:32:19 crc kubenswrapper[4765]: I0319 11:32:19.124978 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9chfw_df325686-add0-407b-afdf-f9093391d64c/extract-content/0.log" Mar 19 11:32:19 crc kubenswrapper[4765]: I0319 11:32:19.138213 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9chfw_df325686-add0-407b-afdf-f9093391d64c/extract-utilities/0.log" Mar 19 11:32:19 crc kubenswrapper[4765]: I0319 11:32:19.138753 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9chfw_df325686-add0-407b-afdf-f9093391d64c/extract-content/0.log" Mar 19 11:32:19 crc kubenswrapper[4765]: I0319 11:32:19.326904 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9chfw_df325686-add0-407b-afdf-f9093391d64c/extract-utilities/0.log" Mar 19 11:32:19 crc kubenswrapper[4765]: I0319 11:32:19.331715 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9chfw_df325686-add0-407b-afdf-f9093391d64c/extract-content/0.log" Mar 19 11:32:19 crc kubenswrapper[4765]: I0319 11:32:19.486708 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-9chfw_df325686-add0-407b-afdf-f9093391d64c/registry-server/0.log" Mar 19 11:32:19 crc kubenswrapper[4765]: I0319 11:32:19.551436 4765 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-operators-prwdk_b99b05c7-b9bf-4814-9d29-b5d9076a98a8/extract-utilities/0.log" Mar 19 11:32:19 crc kubenswrapper[4765]: I0319 11:32:19.725679 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prwdk_b99b05c7-b9bf-4814-9d29-b5d9076a98a8/extract-utilities/0.log" Mar 19 11:32:19 crc kubenswrapper[4765]: I0319 11:32:19.733863 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prwdk_b99b05c7-b9bf-4814-9d29-b5d9076a98a8/extract-content/0.log" Mar 19 11:32:19 crc kubenswrapper[4765]: I0319 11:32:19.772096 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prwdk_b99b05c7-b9bf-4814-9d29-b5d9076a98a8/extract-content/0.log" Mar 19 11:32:19 crc kubenswrapper[4765]: I0319 11:32:19.969662 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prwdk_b99b05c7-b9bf-4814-9d29-b5d9076a98a8/extract-content/0.log" Mar 19 11:32:19 crc kubenswrapper[4765]: I0319 11:32:19.980937 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prwdk_b99b05c7-b9bf-4814-9d29-b5d9076a98a8/extract-utilities/0.log" Mar 19 11:32:20 crc kubenswrapper[4765]: I0319 11:32:20.378202 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-prwdk_b99b05c7-b9bf-4814-9d29-b5d9076a98a8/registry-server/0.log" Mar 19 11:32:53 crc kubenswrapper[4765]: E0319 11:32:53.344866 4765 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.129.56.13:49620->38.129.56.13:35029: write tcp 38.129.56.13:49620->38.129.56.13:35029: write: broken pipe Mar 19 11:33:01 crc kubenswrapper[4765]: I0319 11:33:01.656072 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness 
probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:33:01 crc kubenswrapper[4765]: I0319 11:33:01.656547 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:33:31 crc kubenswrapper[4765]: I0319 11:33:31.656429 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:33:31 crc kubenswrapper[4765]: I0319 11:33:31.656906 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:34:00 crc kubenswrapper[4765]: I0319 11:34:00.151272 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565334-7v4ls"] Mar 19 11:34:00 crc kubenswrapper[4765]: E0319 11:34:00.152267 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2de0d93-78ae-47be-94a0-29c967a62a6b" containerName="oc" Mar 19 11:34:00 crc kubenswrapper[4765]: I0319 11:34:00.152282 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2de0d93-78ae-47be-94a0-29c967a62a6b" containerName="oc" Mar 19 11:34:00 crc kubenswrapper[4765]: I0319 11:34:00.152561 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2de0d93-78ae-47be-94a0-29c967a62a6b" 
containerName="oc" Mar 19 11:34:00 crc kubenswrapper[4765]: I0319 11:34:00.154353 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565334-7v4ls" Mar 19 11:34:00 crc kubenswrapper[4765]: I0319 11:34:00.156718 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:34:00 crc kubenswrapper[4765]: I0319 11:34:00.160274 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:34:00 crc kubenswrapper[4765]: I0319 11:34:00.160303 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:34:00 crc kubenswrapper[4765]: I0319 11:34:00.161040 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565334-7v4ls"] Mar 19 11:34:00 crc kubenswrapper[4765]: I0319 11:34:00.243419 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xwh25\" (UniqueName: \"kubernetes.io/projected/d725f860-71d3-4d54-a6bc-916b5f402dbb-kube-api-access-xwh25\") pod \"auto-csr-approver-29565334-7v4ls\" (UID: \"d725f860-71d3-4d54-a6bc-916b5f402dbb\") " pod="openshift-infra/auto-csr-approver-29565334-7v4ls" Mar 19 11:34:00 crc kubenswrapper[4765]: I0319 11:34:00.345916 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xwh25\" (UniqueName: \"kubernetes.io/projected/d725f860-71d3-4d54-a6bc-916b5f402dbb-kube-api-access-xwh25\") pod \"auto-csr-approver-29565334-7v4ls\" (UID: \"d725f860-71d3-4d54-a6bc-916b5f402dbb\") " pod="openshift-infra/auto-csr-approver-29565334-7v4ls" Mar 19 11:34:00 crc kubenswrapper[4765]: I0319 11:34:00.365230 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xwh25\" (UniqueName: 
\"kubernetes.io/projected/d725f860-71d3-4d54-a6bc-916b5f402dbb-kube-api-access-xwh25\") pod \"auto-csr-approver-29565334-7v4ls\" (UID: \"d725f860-71d3-4d54-a6bc-916b5f402dbb\") " pod="openshift-infra/auto-csr-approver-29565334-7v4ls" Mar 19 11:34:00 crc kubenswrapper[4765]: I0319 11:34:00.473646 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565334-7v4ls" Mar 19 11:34:00 crc kubenswrapper[4765]: I0319 11:34:00.920115 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565334-7v4ls"] Mar 19 11:34:01 crc kubenswrapper[4765]: I0319 11:34:01.656182 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:34:01 crc kubenswrapper[4765]: I0319 11:34:01.656238 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:34:01 crc kubenswrapper[4765]: I0319 11:34:01.656277 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 11:34:01 crc kubenswrapper[4765]: I0319 11:34:01.656978 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d10207e38e3d71513e5ccfc9b07c224dac480f88917cc773ac045cc45750c785"} pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 19 
11:34:01 crc kubenswrapper[4765]: I0319 11:34:01.657026 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://d10207e38e3d71513e5ccfc9b07c224dac480f88917cc773ac045cc45750c785" gracePeriod=600 Mar 19 11:34:01 crc kubenswrapper[4765]: I0319 11:34:01.747606 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565334-7v4ls" event={"ID":"d725f860-71d3-4d54-a6bc-916b5f402dbb","Type":"ContainerStarted","Data":"94b2877ca78ae9d127ad5799e7ddc18a342b38d139abd47be93ccb0db5319df1"} Mar 19 11:34:02 crc kubenswrapper[4765]: I0319 11:34:02.763114 4765 generic.go:334] "Generic (PLEG): container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerID="d10207e38e3d71513e5ccfc9b07c224dac480f88917cc773ac045cc45750c785" exitCode=0 Mar 19 11:34:02 crc kubenswrapper[4765]: I0319 11:34:02.763203 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"d10207e38e3d71513e5ccfc9b07c224dac480f88917cc773ac045cc45750c785"} Mar 19 11:34:02 crc kubenswrapper[4765]: I0319 11:34:02.764181 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerStarted","Data":"2332e590e381d3982e6c429d348a534b80d1d8b7e3e4f2502733427a4c093dc1"} Mar 19 11:34:02 crc kubenswrapper[4765]: I0319 11:34:02.764216 4765 scope.go:117] "RemoveContainer" containerID="3a9adfc926b90648fbd3a3bd0384ffef3d5be74526350ea826846435a9ebdf9e" Mar 19 11:34:02 crc kubenswrapper[4765]: I0319 11:34:02.770943 4765 generic.go:334] "Generic (PLEG): container finished" podID="d725f860-71d3-4d54-a6bc-916b5f402dbb" 
containerID="f9058ac4557299704d9bd543bac1b5947fe5ac15f00efa7def9c53c82c52f162" exitCode=0 Mar 19 11:34:02 crc kubenswrapper[4765]: I0319 11:34:02.771038 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565334-7v4ls" event={"ID":"d725f860-71d3-4d54-a6bc-916b5f402dbb","Type":"ContainerDied","Data":"f9058ac4557299704d9bd543bac1b5947fe5ac15f00efa7def9c53c82c52f162"} Mar 19 11:34:04 crc kubenswrapper[4765]: I0319 11:34:04.125990 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565334-7v4ls" Mar 19 11:34:04 crc kubenswrapper[4765]: I0319 11:34:04.250889 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xwh25\" (UniqueName: \"kubernetes.io/projected/d725f860-71d3-4d54-a6bc-916b5f402dbb-kube-api-access-xwh25\") pod \"d725f860-71d3-4d54-a6bc-916b5f402dbb\" (UID: \"d725f860-71d3-4d54-a6bc-916b5f402dbb\") " Mar 19 11:34:04 crc kubenswrapper[4765]: I0319 11:34:04.256030 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d725f860-71d3-4d54-a6bc-916b5f402dbb-kube-api-access-xwh25" (OuterVolumeSpecName: "kube-api-access-xwh25") pod "d725f860-71d3-4d54-a6bc-916b5f402dbb" (UID: "d725f860-71d3-4d54-a6bc-916b5f402dbb"). InnerVolumeSpecName "kube-api-access-xwh25". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:34:04 crc kubenswrapper[4765]: I0319 11:34:04.353753 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xwh25\" (UniqueName: \"kubernetes.io/projected/d725f860-71d3-4d54-a6bc-916b5f402dbb-kube-api-access-xwh25\") on node \"crc\" DevicePath \"\"" Mar 19 11:34:04 crc kubenswrapper[4765]: I0319 11:34:04.796886 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565334-7v4ls" event={"ID":"d725f860-71d3-4d54-a6bc-916b5f402dbb","Type":"ContainerDied","Data":"94b2877ca78ae9d127ad5799e7ddc18a342b38d139abd47be93ccb0db5319df1"} Mar 19 11:34:04 crc kubenswrapper[4765]: I0319 11:34:04.796931 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="94b2877ca78ae9d127ad5799e7ddc18a342b38d139abd47be93ccb0db5319df1" Mar 19 11:34:04 crc kubenswrapper[4765]: I0319 11:34:04.797019 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565334-7v4ls" Mar 19 11:34:05 crc kubenswrapper[4765]: I0319 11:34:05.210642 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565328-x8xgn"] Mar 19 11:34:05 crc kubenswrapper[4765]: I0319 11:34:05.219177 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565328-x8xgn"] Mar 19 11:34:06 crc kubenswrapper[4765]: I0319 11:34:06.367835 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="061de557-4131-4b58-b828-2a8611198635" path="/var/lib/kubelet/pods/061de557-4131-4b58-b828-2a8611198635/volumes" Mar 19 11:34:10 crc kubenswrapper[4765]: I0319 11:34:10.994107 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-d9npf"] Mar 19 11:34:10 crc kubenswrapper[4765]: E0319 11:34:10.996771 4765 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d725f860-71d3-4d54-a6bc-916b5f402dbb" containerName="oc" Mar 19 11:34:10 crc kubenswrapper[4765]: I0319 11:34:10.996981 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d725f860-71d3-4d54-a6bc-916b5f402dbb" containerName="oc" Mar 19 11:34:10 crc kubenswrapper[4765]: I0319 11:34:10.997383 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d725f860-71d3-4d54-a6bc-916b5f402dbb" containerName="oc" Mar 19 11:34:10 crc kubenswrapper[4765]: I0319 11:34:10.999929 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:11 crc kubenswrapper[4765]: I0319 11:34:11.009779 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9npf"] Mar 19 11:34:11 crc kubenswrapper[4765]: I0319 11:34:11.101499 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29187046-f9f6-40fd-93cb-8db046fadbaf-utilities\") pod \"redhat-marketplace-d9npf\" (UID: \"29187046-f9f6-40fd-93cb-8db046fadbaf\") " pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:11 crc kubenswrapper[4765]: I0319 11:34:11.101615 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29187046-f9f6-40fd-93cb-8db046fadbaf-catalog-content\") pod \"redhat-marketplace-d9npf\" (UID: \"29187046-f9f6-40fd-93cb-8db046fadbaf\") " pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:11 crc kubenswrapper[4765]: I0319 11:34:11.101649 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pk8xb\" (UniqueName: \"kubernetes.io/projected/29187046-f9f6-40fd-93cb-8db046fadbaf-kube-api-access-pk8xb\") pod \"redhat-marketplace-d9npf\" (UID: \"29187046-f9f6-40fd-93cb-8db046fadbaf\") " 
pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:11 crc kubenswrapper[4765]: I0319 11:34:11.203069 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29187046-f9f6-40fd-93cb-8db046fadbaf-catalog-content\") pod \"redhat-marketplace-d9npf\" (UID: \"29187046-f9f6-40fd-93cb-8db046fadbaf\") " pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:11 crc kubenswrapper[4765]: I0319 11:34:11.203118 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pk8xb\" (UniqueName: \"kubernetes.io/projected/29187046-f9f6-40fd-93cb-8db046fadbaf-kube-api-access-pk8xb\") pod \"redhat-marketplace-d9npf\" (UID: \"29187046-f9f6-40fd-93cb-8db046fadbaf\") " pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:11 crc kubenswrapper[4765]: I0319 11:34:11.203230 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29187046-f9f6-40fd-93cb-8db046fadbaf-utilities\") pod \"redhat-marketplace-d9npf\" (UID: \"29187046-f9f6-40fd-93cb-8db046fadbaf\") " pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:11 crc kubenswrapper[4765]: I0319 11:34:11.203806 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29187046-f9f6-40fd-93cb-8db046fadbaf-catalog-content\") pod \"redhat-marketplace-d9npf\" (UID: \"29187046-f9f6-40fd-93cb-8db046fadbaf\") " pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:11 crc kubenswrapper[4765]: I0319 11:34:11.203834 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29187046-f9f6-40fd-93cb-8db046fadbaf-utilities\") pod \"redhat-marketplace-d9npf\" (UID: \"29187046-f9f6-40fd-93cb-8db046fadbaf\") " pod="openshift-marketplace/redhat-marketplace-d9npf" 
Mar 19 11:34:11 crc kubenswrapper[4765]: I0319 11:34:11.243873 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pk8xb\" (UniqueName: \"kubernetes.io/projected/29187046-f9f6-40fd-93cb-8db046fadbaf-kube-api-access-pk8xb\") pod \"redhat-marketplace-d9npf\" (UID: \"29187046-f9f6-40fd-93cb-8db046fadbaf\") " pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:11 crc kubenswrapper[4765]: I0319 11:34:11.335626 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:11 crc kubenswrapper[4765]: I0319 11:34:11.791183 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9npf"] Mar 19 11:34:11 crc kubenswrapper[4765]: I0319 11:34:11.878719 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9npf" event={"ID":"29187046-f9f6-40fd-93cb-8db046fadbaf","Type":"ContainerStarted","Data":"14d77abc8b42b1e41ac42d93d9a6399c15009d9c47a56a98af77ffa2062f900d"} Mar 19 11:34:12 crc kubenswrapper[4765]: I0319 11:34:12.887763 4765 generic.go:334] "Generic (PLEG): container finished" podID="9acef296-5e79-49cf-867a-124137b68d69" containerID="c001e3799f1edd8368719265ddd7f5397c8923fe18420780a16641a10cab45ba" exitCode=0 Mar 19 11:34:12 crc kubenswrapper[4765]: I0319 11:34:12.887867 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-fkmjb/must-gather-4rn4d" event={"ID":"9acef296-5e79-49cf-867a-124137b68d69","Type":"ContainerDied","Data":"c001e3799f1edd8368719265ddd7f5397c8923fe18420780a16641a10cab45ba"} Mar 19 11:34:12 crc kubenswrapper[4765]: I0319 11:34:12.889117 4765 scope.go:117] "RemoveContainer" containerID="c001e3799f1edd8368719265ddd7f5397c8923fe18420780a16641a10cab45ba" Mar 19 11:34:12 crc kubenswrapper[4765]: I0319 11:34:12.891111 4765 generic.go:334] "Generic (PLEG): container finished" 
podID="29187046-f9f6-40fd-93cb-8db046fadbaf" containerID="515d8a6cc2dd78dcd7aca384692e7df036b195bb744254137c79bc5a42783549" exitCode=0 Mar 19 11:34:12 crc kubenswrapper[4765]: I0319 11:34:12.891245 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9npf" event={"ID":"29187046-f9f6-40fd-93cb-8db046fadbaf","Type":"ContainerDied","Data":"515d8a6cc2dd78dcd7aca384692e7df036b195bb744254137c79bc5a42783549"} Mar 19 11:34:13 crc kubenswrapper[4765]: I0319 11:34:13.375488 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fkmjb_must-gather-4rn4d_9acef296-5e79-49cf-867a-124137b68d69/gather/0.log" Mar 19 11:34:16 crc kubenswrapper[4765]: I0319 11:34:16.914649 4765 scope.go:117] "RemoveContainer" containerID="7054ecdfe178fde890d8cadca1b134e6d33df709e2661f54976db2e9b61d2060" Mar 19 11:34:16 crc kubenswrapper[4765]: I0319 11:34:16.927555 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9npf" event={"ID":"29187046-f9f6-40fd-93cb-8db046fadbaf","Type":"ContainerStarted","Data":"98450815f0b9c229ea0b4997389083d41402d5b93fcc804ef194fd0e20d9e9d3"} Mar 19 11:34:18 crc kubenswrapper[4765]: I0319 11:34:18.948515 4765 generic.go:334] "Generic (PLEG): container finished" podID="29187046-f9f6-40fd-93cb-8db046fadbaf" containerID="98450815f0b9c229ea0b4997389083d41402d5b93fcc804ef194fd0e20d9e9d3" exitCode=0 Mar 19 11:34:18 crc kubenswrapper[4765]: I0319 11:34:18.948596 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9npf" event={"ID":"29187046-f9f6-40fd-93cb-8db046fadbaf","Type":"ContainerDied","Data":"98450815f0b9c229ea0b4997389083d41402d5b93fcc804ef194fd0e20d9e9d3"} Mar 19 11:34:20 crc kubenswrapper[4765]: I0319 11:34:20.966947 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9npf" 
event={"ID":"29187046-f9f6-40fd-93cb-8db046fadbaf","Type":"ContainerStarted","Data":"a45be99f33ebc17bd826030ef31194f666523485e059a9d5d0ab16625cf1101c"} Mar 19 11:34:21 crc kubenswrapper[4765]: I0319 11:34:21.336758 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:21 crc kubenswrapper[4765]: I0319 11:34:21.337053 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:22 crc kubenswrapper[4765]: I0319 11:34:22.628674 4765 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-marketplace-d9npf" podUID="29187046-f9f6-40fd-93cb-8db046fadbaf" containerName="registry-server" probeResult="failure" output=< Mar 19 11:34:22 crc kubenswrapper[4765]: timeout: failed to connect service ":50051" within 1s Mar 19 11:34:22 crc kubenswrapper[4765]: > Mar 19 11:34:25 crc kubenswrapper[4765]: I0319 11:34:25.019330 4765 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-d9npf" podStartSLOduration=8.216706477 podStartE2EDuration="15.019308268s" podCreationTimestamp="2026-03-19 11:34:10 +0000 UTC" firstStartedPulling="2026-03-19 11:34:12.893229432 +0000 UTC m=+4351.242174984" lastFinishedPulling="2026-03-19 11:34:19.695831233 +0000 UTC m=+4358.044776775" observedRunningTime="2026-03-19 11:34:20.997171654 +0000 UTC m=+4359.346117196" watchObservedRunningTime="2026-03-19 11:34:25.019308268 +0000 UTC m=+4363.368253810" Mar 19 11:34:25 crc kubenswrapper[4765]: I0319 11:34:25.029655 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-fkmjb/must-gather-4rn4d"] Mar 19 11:34:25 crc kubenswrapper[4765]: I0319 11:34:25.029979 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-fkmjb/must-gather-4rn4d" podUID="9acef296-5e79-49cf-867a-124137b68d69" 
containerName="copy" containerID="cri-o://a39c901d8b22164045c9c8bc3099e8f3ffde9cb1271ab1e6058dcd354dc5e897" gracePeriod=2 Mar 19 11:34:25 crc kubenswrapper[4765]: I0319 11:34:25.041625 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-fkmjb/must-gather-4rn4d"] Mar 19 11:34:25 crc kubenswrapper[4765]: I0319 11:34:25.547127 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fkmjb_must-gather-4rn4d_9acef296-5e79-49cf-867a-124137b68d69/copy/0.log" Mar 19 11:34:25 crc kubenswrapper[4765]: I0319 11:34:25.547835 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-fkmjb/must-gather-4rn4d" Mar 19 11:34:25 crc kubenswrapper[4765]: I0319 11:34:25.594996 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tbbhj\" (UniqueName: \"kubernetes.io/projected/9acef296-5e79-49cf-867a-124137b68d69-kube-api-access-tbbhj\") pod \"9acef296-5e79-49cf-867a-124137b68d69\" (UID: \"9acef296-5e79-49cf-867a-124137b68d69\") " Mar 19 11:34:25 crc kubenswrapper[4765]: I0319 11:34:25.595113 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9acef296-5e79-49cf-867a-124137b68d69-must-gather-output\") pod \"9acef296-5e79-49cf-867a-124137b68d69\" (UID: \"9acef296-5e79-49cf-867a-124137b68d69\") " Mar 19 11:34:25 crc kubenswrapper[4765]: I0319 11:34:25.602289 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9acef296-5e79-49cf-867a-124137b68d69-kube-api-access-tbbhj" (OuterVolumeSpecName: "kube-api-access-tbbhj") pod "9acef296-5e79-49cf-867a-124137b68d69" (UID: "9acef296-5e79-49cf-867a-124137b68d69"). InnerVolumeSpecName "kube-api-access-tbbhj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:34:25 crc kubenswrapper[4765]: I0319 11:34:25.697932 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tbbhj\" (UniqueName: \"kubernetes.io/projected/9acef296-5e79-49cf-867a-124137b68d69-kube-api-access-tbbhj\") on node \"crc\" DevicePath \"\"" Mar 19 11:34:25 crc kubenswrapper[4765]: I0319 11:34:25.767804 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9acef296-5e79-49cf-867a-124137b68d69-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "9acef296-5e79-49cf-867a-124137b68d69" (UID: "9acef296-5e79-49cf-867a-124137b68d69"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:34:25 crc kubenswrapper[4765]: I0319 11:34:25.799778 4765 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/9acef296-5e79-49cf-867a-124137b68d69-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 19 11:34:26 crc kubenswrapper[4765]: I0319 11:34:26.008487 4765 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-fkmjb_must-gather-4rn4d_9acef296-5e79-49cf-867a-124137b68d69/copy/0.log" Mar 19 11:34:26 crc kubenswrapper[4765]: I0319 11:34:26.009007 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-fkmjb/must-gather-4rn4d" Mar 19 11:34:26 crc kubenswrapper[4765]: I0319 11:34:26.008950 4765 generic.go:334] "Generic (PLEG): container finished" podID="9acef296-5e79-49cf-867a-124137b68d69" containerID="a39c901d8b22164045c9c8bc3099e8f3ffde9cb1271ab1e6058dcd354dc5e897" exitCode=143 Mar 19 11:34:26 crc kubenswrapper[4765]: I0319 11:34:26.009103 4765 scope.go:117] "RemoveContainer" containerID="a39c901d8b22164045c9c8bc3099e8f3ffde9cb1271ab1e6058dcd354dc5e897" Mar 19 11:34:26 crc kubenswrapper[4765]: I0319 11:34:26.036389 4765 scope.go:117] "RemoveContainer" containerID="c001e3799f1edd8368719265ddd7f5397c8923fe18420780a16641a10cab45ba" Mar 19 11:34:26 crc kubenswrapper[4765]: I0319 11:34:26.098163 4765 scope.go:117] "RemoveContainer" containerID="a39c901d8b22164045c9c8bc3099e8f3ffde9cb1271ab1e6058dcd354dc5e897" Mar 19 11:34:26 crc kubenswrapper[4765]: E0319 11:34:26.098556 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a39c901d8b22164045c9c8bc3099e8f3ffde9cb1271ab1e6058dcd354dc5e897\": container with ID starting with a39c901d8b22164045c9c8bc3099e8f3ffde9cb1271ab1e6058dcd354dc5e897 not found: ID does not exist" containerID="a39c901d8b22164045c9c8bc3099e8f3ffde9cb1271ab1e6058dcd354dc5e897" Mar 19 11:34:26 crc kubenswrapper[4765]: I0319 11:34:26.098605 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a39c901d8b22164045c9c8bc3099e8f3ffde9cb1271ab1e6058dcd354dc5e897"} err="failed to get container status \"a39c901d8b22164045c9c8bc3099e8f3ffde9cb1271ab1e6058dcd354dc5e897\": rpc error: code = NotFound desc = could not find container \"a39c901d8b22164045c9c8bc3099e8f3ffde9cb1271ab1e6058dcd354dc5e897\": container with ID starting with a39c901d8b22164045c9c8bc3099e8f3ffde9cb1271ab1e6058dcd354dc5e897 not found: ID does not exist" Mar 19 11:34:26 crc kubenswrapper[4765]: I0319 11:34:26.098633 4765 
scope.go:117] "RemoveContainer" containerID="c001e3799f1edd8368719265ddd7f5397c8923fe18420780a16641a10cab45ba" Mar 19 11:34:26 crc kubenswrapper[4765]: E0319 11:34:26.098908 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c001e3799f1edd8368719265ddd7f5397c8923fe18420780a16641a10cab45ba\": container with ID starting with c001e3799f1edd8368719265ddd7f5397c8923fe18420780a16641a10cab45ba not found: ID does not exist" containerID="c001e3799f1edd8368719265ddd7f5397c8923fe18420780a16641a10cab45ba" Mar 19 11:34:26 crc kubenswrapper[4765]: I0319 11:34:26.098945 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c001e3799f1edd8368719265ddd7f5397c8923fe18420780a16641a10cab45ba"} err="failed to get container status \"c001e3799f1edd8368719265ddd7f5397c8923fe18420780a16641a10cab45ba\": rpc error: code = NotFound desc = could not find container \"c001e3799f1edd8368719265ddd7f5397c8923fe18420780a16641a10cab45ba\": container with ID starting with c001e3799f1edd8368719265ddd7f5397c8923fe18420780a16641a10cab45ba not found: ID does not exist" Mar 19 11:34:26 crc kubenswrapper[4765]: I0319 11:34:26.366854 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9acef296-5e79-49cf-867a-124137b68d69" path="/var/lib/kubelet/pods/9acef296-5e79-49cf-867a-124137b68d69/volumes" Mar 19 11:34:31 crc kubenswrapper[4765]: I0319 11:34:31.392985 4765 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:31 crc kubenswrapper[4765]: I0319 11:34:31.449731 4765 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:31 crc kubenswrapper[4765]: I0319 11:34:31.633932 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9npf"] Mar 19 11:34:33 crc 
kubenswrapper[4765]: I0319 11:34:33.070130 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-d9npf" podUID="29187046-f9f6-40fd-93cb-8db046fadbaf" containerName="registry-server" containerID="cri-o://a45be99f33ebc17bd826030ef31194f666523485e059a9d5d0ab16625cf1101c" gracePeriod=2 Mar 19 11:34:33 crc kubenswrapper[4765]: I0319 11:34:33.521924 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:33 crc kubenswrapper[4765]: I0319 11:34:33.655575 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29187046-f9f6-40fd-93cb-8db046fadbaf-catalog-content\") pod \"29187046-f9f6-40fd-93cb-8db046fadbaf\" (UID: \"29187046-f9f6-40fd-93cb-8db046fadbaf\") " Mar 19 11:34:33 crc kubenswrapper[4765]: I0319 11:34:33.655674 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pk8xb\" (UniqueName: \"kubernetes.io/projected/29187046-f9f6-40fd-93cb-8db046fadbaf-kube-api-access-pk8xb\") pod \"29187046-f9f6-40fd-93cb-8db046fadbaf\" (UID: \"29187046-f9f6-40fd-93cb-8db046fadbaf\") " Mar 19 11:34:33 crc kubenswrapper[4765]: I0319 11:34:33.655932 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29187046-f9f6-40fd-93cb-8db046fadbaf-utilities\") pod \"29187046-f9f6-40fd-93cb-8db046fadbaf\" (UID: \"29187046-f9f6-40fd-93cb-8db046fadbaf\") " Mar 19 11:34:33 crc kubenswrapper[4765]: I0319 11:34:33.656682 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29187046-f9f6-40fd-93cb-8db046fadbaf-utilities" (OuterVolumeSpecName: "utilities") pod "29187046-f9f6-40fd-93cb-8db046fadbaf" (UID: "29187046-f9f6-40fd-93cb-8db046fadbaf"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:34:33 crc kubenswrapper[4765]: I0319 11:34:33.661727 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/29187046-f9f6-40fd-93cb-8db046fadbaf-kube-api-access-pk8xb" (OuterVolumeSpecName: "kube-api-access-pk8xb") pod "29187046-f9f6-40fd-93cb-8db046fadbaf" (UID: "29187046-f9f6-40fd-93cb-8db046fadbaf"). InnerVolumeSpecName "kube-api-access-pk8xb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:34:33 crc kubenswrapper[4765]: I0319 11:34:33.686437 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/29187046-f9f6-40fd-93cb-8db046fadbaf-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "29187046-f9f6-40fd-93cb-8db046fadbaf" (UID: "29187046-f9f6-40fd-93cb-8db046fadbaf"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 19 11:34:33 crc kubenswrapper[4765]: I0319 11:34:33.758030 4765 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/29187046-f9f6-40fd-93cb-8db046fadbaf-utilities\") on node \"crc\" DevicePath \"\"" Mar 19 11:34:33 crc kubenswrapper[4765]: I0319 11:34:33.758070 4765 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/29187046-f9f6-40fd-93cb-8db046fadbaf-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 19 11:34:33 crc kubenswrapper[4765]: I0319 11:34:33.758082 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pk8xb\" (UniqueName: \"kubernetes.io/projected/29187046-f9f6-40fd-93cb-8db046fadbaf-kube-api-access-pk8xb\") on node \"crc\" DevicePath \"\"" Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.084701 4765 generic.go:334] "Generic (PLEG): container finished" podID="29187046-f9f6-40fd-93cb-8db046fadbaf" 
containerID="a45be99f33ebc17bd826030ef31194f666523485e059a9d5d0ab16625cf1101c" exitCode=0 Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.084773 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-d9npf" Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.084782 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9npf" event={"ID":"29187046-f9f6-40fd-93cb-8db046fadbaf","Type":"ContainerDied","Data":"a45be99f33ebc17bd826030ef31194f666523485e059a9d5d0ab16625cf1101c"} Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.085248 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-d9npf" event={"ID":"29187046-f9f6-40fd-93cb-8db046fadbaf","Type":"ContainerDied","Data":"14d77abc8b42b1e41ac42d93d9a6399c15009d9c47a56a98af77ffa2062f900d"} Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.085276 4765 scope.go:117] "RemoveContainer" containerID="a45be99f33ebc17bd826030ef31194f666523485e059a9d5d0ab16625cf1101c" Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.106065 4765 scope.go:117] "RemoveContainer" containerID="98450815f0b9c229ea0b4997389083d41402d5b93fcc804ef194fd0e20d9e9d3" Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.124604 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9npf"] Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.133640 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-d9npf"] Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.145648 4765 scope.go:117] "RemoveContainer" containerID="515d8a6cc2dd78dcd7aca384692e7df036b195bb744254137c79bc5a42783549" Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.196135 4765 scope.go:117] "RemoveContainer" containerID="a45be99f33ebc17bd826030ef31194f666523485e059a9d5d0ab16625cf1101c" Mar 19 
11:34:34 crc kubenswrapper[4765]: E0319 11:34:34.201574 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a45be99f33ebc17bd826030ef31194f666523485e059a9d5d0ab16625cf1101c\": container with ID starting with a45be99f33ebc17bd826030ef31194f666523485e059a9d5d0ab16625cf1101c not found: ID does not exist" containerID="a45be99f33ebc17bd826030ef31194f666523485e059a9d5d0ab16625cf1101c" Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.201630 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a45be99f33ebc17bd826030ef31194f666523485e059a9d5d0ab16625cf1101c"} err="failed to get container status \"a45be99f33ebc17bd826030ef31194f666523485e059a9d5d0ab16625cf1101c\": rpc error: code = NotFound desc = could not find container \"a45be99f33ebc17bd826030ef31194f666523485e059a9d5d0ab16625cf1101c\": container with ID starting with a45be99f33ebc17bd826030ef31194f666523485e059a9d5d0ab16625cf1101c not found: ID does not exist" Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.201659 4765 scope.go:117] "RemoveContainer" containerID="98450815f0b9c229ea0b4997389083d41402d5b93fcc804ef194fd0e20d9e9d3" Mar 19 11:34:34 crc kubenswrapper[4765]: E0319 11:34:34.202061 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"98450815f0b9c229ea0b4997389083d41402d5b93fcc804ef194fd0e20d9e9d3\": container with ID starting with 98450815f0b9c229ea0b4997389083d41402d5b93fcc804ef194fd0e20d9e9d3 not found: ID does not exist" containerID="98450815f0b9c229ea0b4997389083d41402d5b93fcc804ef194fd0e20d9e9d3" Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.202168 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"98450815f0b9c229ea0b4997389083d41402d5b93fcc804ef194fd0e20d9e9d3"} err="failed to get container status 
\"98450815f0b9c229ea0b4997389083d41402d5b93fcc804ef194fd0e20d9e9d3\": rpc error: code = NotFound desc = could not find container \"98450815f0b9c229ea0b4997389083d41402d5b93fcc804ef194fd0e20d9e9d3\": container with ID starting with 98450815f0b9c229ea0b4997389083d41402d5b93fcc804ef194fd0e20d9e9d3 not found: ID does not exist" Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.202256 4765 scope.go:117] "RemoveContainer" containerID="515d8a6cc2dd78dcd7aca384692e7df036b195bb744254137c79bc5a42783549" Mar 19 11:34:34 crc kubenswrapper[4765]: E0319 11:34:34.202610 4765 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"515d8a6cc2dd78dcd7aca384692e7df036b195bb744254137c79bc5a42783549\": container with ID starting with 515d8a6cc2dd78dcd7aca384692e7df036b195bb744254137c79bc5a42783549 not found: ID does not exist" containerID="515d8a6cc2dd78dcd7aca384692e7df036b195bb744254137c79bc5a42783549" Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.202658 4765 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"515d8a6cc2dd78dcd7aca384692e7df036b195bb744254137c79bc5a42783549"} err="failed to get container status \"515d8a6cc2dd78dcd7aca384692e7df036b195bb744254137c79bc5a42783549\": rpc error: code = NotFound desc = could not find container \"515d8a6cc2dd78dcd7aca384692e7df036b195bb744254137c79bc5a42783549\": container with ID starting with 515d8a6cc2dd78dcd7aca384692e7df036b195bb744254137c79bc5a42783549 not found: ID does not exist" Mar 19 11:34:34 crc kubenswrapper[4765]: I0319 11:34:34.367119 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="29187046-f9f6-40fd-93cb-8db046fadbaf" path="/var/lib/kubelet/pods/29187046-f9f6-40fd-93cb-8db046fadbaf/volumes" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.149802 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565336-nvjpn"] Mar 19 11:36:00 
crc kubenswrapper[4765]: E0319 11:36:00.150976 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29187046-f9f6-40fd-93cb-8db046fadbaf" containerName="extract-utilities" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.150994 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="29187046-f9f6-40fd-93cb-8db046fadbaf" containerName="extract-utilities" Mar 19 11:36:00 crc kubenswrapper[4765]: E0319 11:36:00.151020 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29187046-f9f6-40fd-93cb-8db046fadbaf" containerName="registry-server" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.151028 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="29187046-f9f6-40fd-93cb-8db046fadbaf" containerName="registry-server" Mar 19 11:36:00 crc kubenswrapper[4765]: E0319 11:36:00.151048 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9acef296-5e79-49cf-867a-124137b68d69" containerName="copy" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.151055 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="9acef296-5e79-49cf-867a-124137b68d69" containerName="copy" Mar 19 11:36:00 crc kubenswrapper[4765]: E0319 11:36:00.151078 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9acef296-5e79-49cf-867a-124137b68d69" containerName="gather" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.151085 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="9acef296-5e79-49cf-867a-124137b68d69" containerName="gather" Mar 19 11:36:00 crc kubenswrapper[4765]: E0319 11:36:00.151113 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="29187046-f9f6-40fd-93cb-8db046fadbaf" containerName="extract-content" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.151122 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="29187046-f9f6-40fd-93cb-8db046fadbaf" containerName="extract-content" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.151335 
4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="9acef296-5e79-49cf-867a-124137b68d69" containerName="gather" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.151365 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="29187046-f9f6-40fd-93cb-8db046fadbaf" containerName="registry-server" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.151377 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="9acef296-5e79-49cf-867a-124137b68d69" containerName="copy" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.152187 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565336-nvjpn" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.155517 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.155618 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.160493 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.160889 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565336-nvjpn"] Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.303014 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gch7m\" (UniqueName: \"kubernetes.io/projected/d0fe2d00-bc9e-412d-b076-46035c5cf844-kube-api-access-gch7m\") pod \"auto-csr-approver-29565336-nvjpn\" (UID: \"d0fe2d00-bc9e-412d-b076-46035c5cf844\") " pod="openshift-infra/auto-csr-approver-29565336-nvjpn" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.405557 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"kube-api-access-gch7m\" (UniqueName: \"kubernetes.io/projected/d0fe2d00-bc9e-412d-b076-46035c5cf844-kube-api-access-gch7m\") pod \"auto-csr-approver-29565336-nvjpn\" (UID: \"d0fe2d00-bc9e-412d-b076-46035c5cf844\") " pod="openshift-infra/auto-csr-approver-29565336-nvjpn" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.427670 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gch7m\" (UniqueName: \"kubernetes.io/projected/d0fe2d00-bc9e-412d-b076-46035c5cf844-kube-api-access-gch7m\") pod \"auto-csr-approver-29565336-nvjpn\" (UID: \"d0fe2d00-bc9e-412d-b076-46035c5cf844\") " pod="openshift-infra/auto-csr-approver-29565336-nvjpn" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.476131 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565336-nvjpn" Mar 19 11:36:00 crc kubenswrapper[4765]: I0319 11:36:00.968281 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565336-nvjpn"] Mar 19 11:36:01 crc kubenswrapper[4765]: I0319 11:36:01.655976 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:36:01 crc kubenswrapper[4765]: I0319 11:36:01.656045 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:36:01 crc kubenswrapper[4765]: I0319 11:36:01.874094 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565336-nvjpn" 
event={"ID":"d0fe2d00-bc9e-412d-b076-46035c5cf844","Type":"ContainerStarted","Data":"6a404dd4eae2eed77bbab82ca42f3b48ca62853ccdba231f19d4ded767c0d65b"} Mar 19 11:36:02 crc kubenswrapper[4765]: I0319 11:36:02.885111 4765 generic.go:334] "Generic (PLEG): container finished" podID="d0fe2d00-bc9e-412d-b076-46035c5cf844" containerID="3b7e8af4fd53b68693cfb467a1312beff008f5f5af843e398d665787e8c51b5b" exitCode=0 Mar 19 11:36:02 crc kubenswrapper[4765]: I0319 11:36:02.885234 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565336-nvjpn" event={"ID":"d0fe2d00-bc9e-412d-b076-46035c5cf844","Type":"ContainerDied","Data":"3b7e8af4fd53b68693cfb467a1312beff008f5f5af843e398d665787e8c51b5b"} Mar 19 11:36:04 crc kubenswrapper[4765]: I0319 11:36:04.243316 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565336-nvjpn" Mar 19 11:36:04 crc kubenswrapper[4765]: I0319 11:36:04.382174 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gch7m\" (UniqueName: \"kubernetes.io/projected/d0fe2d00-bc9e-412d-b076-46035c5cf844-kube-api-access-gch7m\") pod \"d0fe2d00-bc9e-412d-b076-46035c5cf844\" (UID: \"d0fe2d00-bc9e-412d-b076-46035c5cf844\") " Mar 19 11:36:04 crc kubenswrapper[4765]: I0319 11:36:04.788529 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0fe2d00-bc9e-412d-b076-46035c5cf844-kube-api-access-gch7m" (OuterVolumeSpecName: "kube-api-access-gch7m") pod "d0fe2d00-bc9e-412d-b076-46035c5cf844" (UID: "d0fe2d00-bc9e-412d-b076-46035c5cf844"). InnerVolumeSpecName "kube-api-access-gch7m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:36:04 crc kubenswrapper[4765]: I0319 11:36:04.793929 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gch7m\" (UniqueName: \"kubernetes.io/projected/d0fe2d00-bc9e-412d-b076-46035c5cf844-kube-api-access-gch7m\") on node \"crc\" DevicePath \"\"" Mar 19 11:36:04 crc kubenswrapper[4765]: I0319 11:36:04.901810 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565336-nvjpn" event={"ID":"d0fe2d00-bc9e-412d-b076-46035c5cf844","Type":"ContainerDied","Data":"6a404dd4eae2eed77bbab82ca42f3b48ca62853ccdba231f19d4ded767c0d65b"} Mar 19 11:36:04 crc kubenswrapper[4765]: I0319 11:36:04.901850 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6a404dd4eae2eed77bbab82ca42f3b48ca62853ccdba231f19d4ded767c0d65b" Mar 19 11:36:04 crc kubenswrapper[4765]: I0319 11:36:04.901898 4765 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565336-nvjpn" Mar 19 11:36:05 crc kubenswrapper[4765]: I0319 11:36:05.314189 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565330-2kgwr"] Mar 19 11:36:05 crc kubenswrapper[4765]: I0319 11:36:05.325482 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565330-2kgwr"] Mar 19 11:36:06 crc kubenswrapper[4765]: I0319 11:36:06.365719 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="32da5e9d-72fb-43b8-903f-67dcbb0187bf" path="/var/lib/kubelet/pods/32da5e9d-72fb-43b8-903f-67dcbb0187bf/volumes" Mar 19 11:36:17 crc kubenswrapper[4765]: I0319 11:36:17.042302 4765 scope.go:117] "RemoveContainer" containerID="856850e4b24f916e772618b6af943d3b48262190519902ac549d6c3f7286b494" Mar 19 11:36:31 crc kubenswrapper[4765]: I0319 11:36:31.656674 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l 
container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:36:31 crc kubenswrapper[4765]: I0319 11:36:31.657751 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:37:01 crc kubenswrapper[4765]: I0319 11:37:01.656989 4765 patch_prober.go:28] interesting pod/machine-config-daemon-4sj5l container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 19 11:37:01 crc kubenswrapper[4765]: I0319 11:37:01.657664 4765 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 19 11:37:01 crc kubenswrapper[4765]: I0319 11:37:01.657741 4765 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" Mar 19 11:37:01 crc kubenswrapper[4765]: I0319 11:37:01.658722 4765 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2332e590e381d3982e6c429d348a534b80d1d8b7e3e4f2502733427a4c093dc1"} pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" containerMessage="Container machine-config-daemon failed liveness probe, will be 
restarted" Mar 19 11:37:01 crc kubenswrapper[4765]: I0319 11:37:01.658806 4765 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerName="machine-config-daemon" containerID="cri-o://2332e590e381d3982e6c429d348a534b80d1d8b7e3e4f2502733427a4c093dc1" gracePeriod=600 Mar 19 11:37:01 crc kubenswrapper[4765]: E0319 11:37:01.780847 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:37:02 crc kubenswrapper[4765]: I0319 11:37:02.096309 4765 generic.go:334] "Generic (PLEG): container finished" podID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" containerID="2332e590e381d3982e6c429d348a534b80d1d8b7e3e4f2502733427a4c093dc1" exitCode=0 Mar 19 11:37:02 crc kubenswrapper[4765]: I0319 11:37:02.096389 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" event={"ID":"a7d72ad1-7f25-4580-b845-7f66e8f78bff","Type":"ContainerDied","Data":"2332e590e381d3982e6c429d348a534b80d1d8b7e3e4f2502733427a4c093dc1"} Mar 19 11:37:02 crc kubenswrapper[4765]: I0319 11:37:02.096459 4765 scope.go:117] "RemoveContainer" containerID="d10207e38e3d71513e5ccfc9b07c224dac480f88917cc773ac045cc45750c785" Mar 19 11:37:02 crc kubenswrapper[4765]: I0319 11:37:02.097829 4765 scope.go:117] "RemoveContainer" containerID="2332e590e381d3982e6c429d348a534b80d1d8b7e3e4f2502733427a4c093dc1" Mar 19 11:37:02 crc kubenswrapper[4765]: E0319 11:37:02.098512 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:37:14 crc kubenswrapper[4765]: I0319 11:37:14.356855 4765 scope.go:117] "RemoveContainer" containerID="2332e590e381d3982e6c429d348a534b80d1d8b7e3e4f2502733427a4c093dc1" Mar 19 11:37:14 crc kubenswrapper[4765]: E0319 11:37:14.357645 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:37:27 crc kubenswrapper[4765]: I0319 11:37:27.357285 4765 scope.go:117] "RemoveContainer" containerID="2332e590e381d3982e6c429d348a534b80d1d8b7e3e4f2502733427a4c093dc1" Mar 19 11:37:27 crc kubenswrapper[4765]: E0319 11:37:27.358815 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:37:39 crc kubenswrapper[4765]: I0319 11:37:39.356251 4765 scope.go:117] "RemoveContainer" containerID="2332e590e381d3982e6c429d348a534b80d1d8b7e3e4f2502733427a4c093dc1" Mar 19 11:37:39 crc kubenswrapper[4765]: E0319 11:37:39.357158 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:37:54 crc kubenswrapper[4765]: I0319 11:37:54.357308 4765 scope.go:117] "RemoveContainer" containerID="2332e590e381d3982e6c429d348a534b80d1d8b7e3e4f2502733427a4c093dc1" Mar 19 11:37:54 crc kubenswrapper[4765]: E0319 11:37:54.358529 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:38:00 crc kubenswrapper[4765]: I0319 11:38:00.152827 4765 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29565338-5jtt9"] Mar 19 11:38:00 crc kubenswrapper[4765]: E0319 11:38:00.153872 4765 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0fe2d00-bc9e-412d-b076-46035c5cf844" containerName="oc" Mar 19 11:38:00 crc kubenswrapper[4765]: I0319 11:38:00.153890 4765 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0fe2d00-bc9e-412d-b076-46035c5cf844" containerName="oc" Mar 19 11:38:00 crc kubenswrapper[4765]: I0319 11:38:00.154184 4765 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0fe2d00-bc9e-412d-b076-46035c5cf844" containerName="oc" Mar 19 11:38:00 crc kubenswrapper[4765]: I0319 11:38:00.155033 4765 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565338-5jtt9" Mar 19 11:38:00 crc kubenswrapper[4765]: I0319 11:38:00.157517 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 19 11:38:00 crc kubenswrapper[4765]: I0319 11:38:00.157808 4765 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-rnmqm" Mar 19 11:38:00 crc kubenswrapper[4765]: I0319 11:38:00.161481 4765 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 19 11:38:00 crc kubenswrapper[4765]: I0319 11:38:00.166819 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565338-5jtt9"] Mar 19 11:38:00 crc kubenswrapper[4765]: I0319 11:38:00.203833 4765 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kjfq\" (UniqueName: \"kubernetes.io/projected/bbbcb374-9757-492a-a279-7ad55393373e-kube-api-access-6kjfq\") pod \"auto-csr-approver-29565338-5jtt9\" (UID: \"bbbcb374-9757-492a-a279-7ad55393373e\") " pod="openshift-infra/auto-csr-approver-29565338-5jtt9" Mar 19 11:38:00 crc kubenswrapper[4765]: I0319 11:38:00.305451 4765 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6kjfq\" (UniqueName: \"kubernetes.io/projected/bbbcb374-9757-492a-a279-7ad55393373e-kube-api-access-6kjfq\") pod \"auto-csr-approver-29565338-5jtt9\" (UID: \"bbbcb374-9757-492a-a279-7ad55393373e\") " pod="openshift-infra/auto-csr-approver-29565338-5jtt9" Mar 19 11:38:00 crc kubenswrapper[4765]: I0319 11:38:00.327258 4765 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kjfq\" (UniqueName: \"kubernetes.io/projected/bbbcb374-9757-492a-a279-7ad55393373e-kube-api-access-6kjfq\") pod \"auto-csr-approver-29565338-5jtt9\" (UID: \"bbbcb374-9757-492a-a279-7ad55393373e\") " 
pod="openshift-infra/auto-csr-approver-29565338-5jtt9" Mar 19 11:38:00 crc kubenswrapper[4765]: I0319 11:38:00.475498 4765 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29565338-5jtt9" Mar 19 11:38:00 crc kubenswrapper[4765]: I0319 11:38:00.934880 4765 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29565338-5jtt9"] Mar 19 11:38:00 crc kubenswrapper[4765]: I0319 11:38:00.944110 4765 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 19 11:38:01 crc kubenswrapper[4765]: I0319 11:38:01.640732 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565338-5jtt9" event={"ID":"bbbcb374-9757-492a-a279-7ad55393373e","Type":"ContainerStarted","Data":"3770fa7d56b92c04c14fe98087b71d02d0d99cd79920e690d39219e24e1980b5"} Mar 19 11:38:03 crc kubenswrapper[4765]: I0319 11:38:03.667529 4765 generic.go:334] "Generic (PLEG): container finished" podID="bbbcb374-9757-492a-a279-7ad55393373e" containerID="2baa2761f78908a621409b954ce977b73029d20c04fd359cb3b91145702e9add" exitCode=0 Mar 19 11:38:03 crc kubenswrapper[4765]: I0319 11:38:03.667642 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565338-5jtt9" event={"ID":"bbbcb374-9757-492a-a279-7ad55393373e","Type":"ContainerDied","Data":"2baa2761f78908a621409b954ce977b73029d20c04fd359cb3b91145702e9add"} Mar 19 11:38:05 crc kubenswrapper[4765]: I0319 11:38:05.100249 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565338-5jtt9" Mar 19 11:38:05 crc kubenswrapper[4765]: I0319 11:38:05.212020 4765 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6kjfq\" (UniqueName: \"kubernetes.io/projected/bbbcb374-9757-492a-a279-7ad55393373e-kube-api-access-6kjfq\") pod \"bbbcb374-9757-492a-a279-7ad55393373e\" (UID: \"bbbcb374-9757-492a-a279-7ad55393373e\") " Mar 19 11:38:05 crc kubenswrapper[4765]: I0319 11:38:05.218638 4765 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bbbcb374-9757-492a-a279-7ad55393373e-kube-api-access-6kjfq" (OuterVolumeSpecName: "kube-api-access-6kjfq") pod "bbbcb374-9757-492a-a279-7ad55393373e" (UID: "bbbcb374-9757-492a-a279-7ad55393373e"). InnerVolumeSpecName "kube-api-access-6kjfq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 19 11:38:05 crc kubenswrapper[4765]: I0319 11:38:05.314294 4765 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6kjfq\" (UniqueName: \"kubernetes.io/projected/bbbcb374-9757-492a-a279-7ad55393373e-kube-api-access-6kjfq\") on node \"crc\" DevicePath \"\"" Mar 19 11:38:05 crc kubenswrapper[4765]: I0319 11:38:05.698569 4765 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29565338-5jtt9" event={"ID":"bbbcb374-9757-492a-a279-7ad55393373e","Type":"ContainerDied","Data":"3770fa7d56b92c04c14fe98087b71d02d0d99cd79920e690d39219e24e1980b5"} Mar 19 11:38:05 crc kubenswrapper[4765]: I0319 11:38:05.698615 4765 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3770fa7d56b92c04c14fe98087b71d02d0d99cd79920e690d39219e24e1980b5" Mar 19 11:38:05 crc kubenswrapper[4765]: I0319 11:38:05.698689 4765 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29565338-5jtt9" Mar 19 11:38:06 crc kubenswrapper[4765]: I0319 11:38:06.171311 4765 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29565332-k5j8m"] Mar 19 11:38:06 crc kubenswrapper[4765]: I0319 11:38:06.178787 4765 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29565332-k5j8m"] Mar 19 11:38:06 crc kubenswrapper[4765]: I0319 11:38:06.368638 4765 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2de0d93-78ae-47be-94a0-29c967a62a6b" path="/var/lib/kubelet/pods/a2de0d93-78ae-47be-94a0-29c967a62a6b/volumes" Mar 19 11:38:08 crc kubenswrapper[4765]: I0319 11:38:08.360093 4765 scope.go:117] "RemoveContainer" containerID="2332e590e381d3982e6c429d348a534b80d1d8b7e3e4f2502733427a4c093dc1" Mar 19 11:38:08 crc kubenswrapper[4765]: E0319 11:38:08.360732 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff" Mar 19 11:38:17 crc kubenswrapper[4765]: I0319 11:38:17.128940 4765 scope.go:117] "RemoveContainer" containerID="8c214342dac72475636a42bad146929000979cdf34f5e47daa7af5009d771684" Mar 19 11:38:20 crc kubenswrapper[4765]: I0319 11:38:20.355938 4765 scope.go:117] "RemoveContainer" containerID="2332e590e381d3982e6c429d348a534b80d1d8b7e3e4f2502733427a4c093dc1" Mar 19 11:38:20 crc kubenswrapper[4765]: E0319 11:38:20.356660 4765 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-4sj5l_openshift-machine-config-operator(a7d72ad1-7f25-4580-b845-7f66e8f78bff)\"" pod="openshift-machine-config-operator/machine-config-daemon-4sj5l" podUID="a7d72ad1-7f25-4580-b845-7f66e8f78bff"